Double plasma progress at DESY

New developments tackle two of the biggest challenges in plasma-wave acceleration: beam quality and bunch rate.

What if, instead of being accelerated by tonnes of metal, electrons could “surf” on a wave of charge displacements in a plasma? This question, posed in 1979 by Toshiki Tajima and John Dawson, planted the seed for plasma wakefield acceleration (PWA). Scientists at DESY now report some of the first signs that PWA is ready to compete with traditional accelerators at low energies. The results tackle two of the biggest challenges in PWA: beam quality and bunch rate.

“We have made great progress in the field of plasma acceleration,” says Andreas Maier, DESY’s lead scientist for plasma acceleration, “but this is an endeavour that has only just started, and we still have a bit of homework to do to get the system integrated with the injector complexes of a synchrotron, which is our final goal.”

Riding a wave

PWA has the potential to radically miniaturise particle accelerators. Plasma waves are generated when a laser pulse or particle beam ploughs through a millimetres-long hydrogen-filled capillary, displacing electrons and creating a wake of alternating positive and negative charge regions behind it. The process is akin to flotsam and jetsam being accelerated in the wake of a speedboat, and the plasma “wakefields” can be thousands of times stronger than the electric fields in conventional accelerators, allowing particles to gain hundreds of MeV in just a few millimetres. But beam quality and intensity are significant challenges in such narrow confines.
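The claim can be made quantitative with a textbook estimate: in a uniform plasma of electron density n₀, the largest field a plasma wave can sustain before breaking is approximately

\[
E_{\mathrm{WB}} = \frac{m_e c\, \omega_p}{e} \approx 96\,\sqrt{n_0\,[\mathrm{cm^{-3}}]}\ \mathrm{V/m},
\]

where ω_p is the plasma frequency. For a typical density of 10¹⁸ cm⁻³ this gives roughly 100 GV/m – some three orders of magnitude beyond the tens of MV/m sustained by conventional RF cavities.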

In a first study, a team from the LUX experiment at DESY and the University of Hamburg demonstrated, for the first time, a two-stage correction system to dramatically reduce the energy spread of accelerated electron beams. The first stage stretches the longitudinal extent of the beam from a few femtoseconds to several picoseconds using a series of four zigzagging bending magnets called a magnetic chicane. Next, a radio-frequency cavity reduces the energy variation to below 0.1%, bringing the beam quality in line with conventional accelerators.
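The logic of the two-stage scheme fits in a few lines of code. The sketch below is a minimal longitudinal phase-space model with illustrative numbers – the R56 and bunch parameters are assumptions, not LUX values: the chicane converts energy deviation into arrival time, and an RF kick proportional to arrival time then cancels the correlated spread.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
c = 3.0e8                            # speed of light, m/s

# Illustrative initial beam: few-fs bunch with 1% rms energy spread
N = 10_000
t     = rng.normal(0.0, 3e-15, N)    # arrival time, s
delta = rng.normal(0.0, 1e-2,  N)    # relative energy deviation dE/E

# Stage 1 -- magnetic chicane: path length depends on energy (R56),
# stretching the bunch to picoseconds and imprinting a t-delta correlation
R56 = 30e-3                          # longitudinal dispersion, m (assumed)
t = t + R56 * delta / c

# Stage 2 -- RF cavity at the zero crossing: an energy kick linear in
# arrival time, with slope chosen to cancel the correlated spread
delta = delta - (c / R56) * t

print(f"rms energy spread after compression: {100 * delta.std():.4f}%")
# ~0.003%: the residual is set by the initial bunch length, not the 1% spread
```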

“We basically trade beam current for energy stability,” explains Paul Winkler, lead author of a recent publication on active energy compression. “But for the intended application of a synchrotron injector, we would need to stretch the electron bunches anyway. As a result, we achieved performance levels so far only associated with conventional accelerators.”

But producing high-quality beams is only half the battle. To make laser-driven PWA a practical proposition, bunches must be accelerated not just once a second, like at LUX, but hundreds or thousands of times per second. This has now been demonstrated by KALDERA, DESY’s new high-power laser system (see “Beam quality and bunch rate” image).

“Already, on the first try, we were able to accelerate 100 electron bunches per second,” says principal investigator Manuel Kirchen, who emphasises the complementarity of the two advances. The team now plans to scale up the energy and deploy “active stabilisation” to improve beam quality. “The next major goal is to demonstrate that we can continuously run the plasma accelerators with high stability,” he says.

With the exception of CERN’s AWAKE experiment (CERN Courier May/June 2024 p25), almost all plasma-wakefield accelerators are designed with medical or industrial applications in mind. Medical applications are particularly promising as they require lower beam energies and place less demanding constraints on beam quality. Advances such as those reported by LUX and KALDERA raise confidence in this new technology and could eventually open the door to cheaper and more portable X-ray equipment, allowing medical imaging and cancer therapy to take place in university labs and hospitals.

Powering into the future

Nuria Catalan Lasheras and Igor Syratchev explain why klystrons are strategically important to the future of the field – and how CERN plans to boost their efficiency above 90%.

The Higgs boson is the most intriguing and unusual object yet discovered by fundamental science. There is no higher experimental priority for particle physics than building an electron–positron collider to produce it copiously and study it precisely. Given the importance of energy efficiency and cost effectiveness in the current geopolitical context, this gives unique strategic importance to developing a humble technology called the klystron – a technology that will consume the majority of site power at every major electron–positron collider under consideration, but which has historically only achieved 60% energy efficiency.

The klystron was invented in 1937 by two American brothers, Russell and Sigurd Varian. The Varians wanted to improve aircraft radar systems. At the time, there was a growing need for better high-frequency amplification to detect objects at a distance using radar, a critical technology in the lead-up to World War II.

The Varians’ RF source operated around 3.2 GHz, or a wavelength of about 9.4 cm, in the microwave region of the electromagnetic spectrum. At the time, this was an extraordinarily high frequency – conventional vacuum tubes struggled beyond 300 MHz. Microwave wavelengths promised better resolution, less noise, and the ability to penetrate rain and fog. Crucially, antennas could be small enough to fit on ships and planes. But the source was far too weak for radar.

The Varians’ genius was to invent a way to amplify the electromagnetic signal by up to 30 dB, or a factor of 1000. The US and British military used the klystron for airborne radar, submarine detection of U-boats in the Atlantic and naval gun targeting beyond visual range. Radar helped win the Battle of Britain, the Battle of the Atlantic and Pacific naval battles, making surprise attacks harder by giving advance warning. Winston Churchill called radar “the secret weapon of WWII”, and the klystron was one of its enabling technologies.

With its high gain and narrow bandwidth, the klystron was the first practical microwave amplifier and became foundational in radio-frequency (RF) technology. This was the first time anyone had efficiently amplified microwaves with stability and directionality. Klystrons have since been used in satellite communication, broadcasting and particle accelerators, where they power the resonant RF cavities that accelerate the beams. Klystrons are therefore ubiquitous in medical, industrial and research accelerators – and not least in the next generation of Higgs factories, which are central to the future of high-energy physics.

Klystrons and the Higgs

Hadron colliders like the LHC tend to be circular. Their fundamental energy limit is given by the maximum strength of the bending magnets and the circumference of the tunnel. A handful of RF cavities repeatedly accelerate beams of protons or ions after hundreds or thousands of bending magnets force the beams to loop back through them.

Operating principle

Thanks to their clean and precisely controllable collisions, all Higgs factories under consideration are electron–positron colliders. Electron–positron colliders can be either circular or linear in construction. The dynamics of circular electron–positron colliders are radically different from those of hadron machines, as the particles are roughly 2000 times lighter than protons. The strength required from the bending magnets is relatively low for any practical circumference; however, the energy of the particles must be continually replenished, as they radiate away energy in the bends through synchrotron radiation, requiring hundreds of RF cavities. RF cavities are equally important in the linear case. Here, all the energy must be imparted in a single pass, with each cavity accelerating the beam only once, requiring hundreds or even thousands of RF cavities.
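The scale of the problem is set by a standard result of accelerator physics: the energy radiated per turn by an electron of energy E on a bending radius ρ is

\[
U_0\,[\mathrm{GeV}] \approx 8.85\times10^{-5}\ \frac{(E\,[\mathrm{GeV}])^4}{\rho\,[\mathrm{m}]}.
\]

Because the loss grows as the fourth power of E/mc², a proton of the same energy on the same radius radiates about 10¹³ times less, which is why synchrotron-radiation replenishment shapes the design of circular electron–positron colliders but not hadron machines.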

Either way, 50 to 60% of the total energy consumed by an electron–positron collider is used for RF acceleration, compared to a relatively small fraction in a hadron collider. Efficiently powering the RF cavities is of paramount importance to the energy efficiency and cost effectiveness of the facility as a whole. RF acceleration is therefore of far greater significance at electron–positron colliders than at hadron colliders.

From a pen to a mid-size car

RF cavities cannot simply be plugged into the wall. These finely tuned resonant structures must be excited by RF power – an alternating microwave electromagnetic field that is supplied through waveguides at the appropriate frequency. Due to the geometry of resonant cavities, this excites an on-axis oscillating electrical field. Particles that arrive when the electrical field has the right direction are accelerated. For this reason, particles in an accelerator travel in bunches separated by a long distance, during which the RF field is not optimised for acceleration.
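In the simplest picture, a particle of charge q crossing a cavity of peak effective voltage V₀ at RF phase φ gains energy

\[
\Delta E = q V_0 \cos\varphi,
\]

so bunches are timed to arrive near the crest (φ ≈ 0), while a particle arriving half an RF period later would be decelerated instead.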

CLIC klystron

Despite the development of modern solid-state amplifiers, the Varians’ klystron is still the most practical technology for generating RF power at the MW level. Klystrons can be as small as a pen or as large and heavy as a mid-size car, depending on the frequency and power required. Linear colliders use higher frequencies, which support higher gradients and shorten the linac, whereas a circular collider does not need high gradients, as the energy to be restored each turn is smaller.

Klystrons fall under the general classification of vacuum tubes – fully enclosed miniature electron accelerators with their own source, accelerating path and “interaction region” where the RF field is produced. Their name is derived from the Greek verb describing the action of waves crashing against the seashore. In a klystron, RF power is generated when electrons crash against a decelerating electric field.

Every klystron contains at least two cavities: an input and an output. The input cavity is powered by a weak RF source that must be amplified. The output cavity generates the strongly amplified RF signal generated by the klystron. All this comes encapsulated in an ultra-high vacuum volume inside the field of a solenoid for focusing (see “Operating principle” figure).

Inside the klystron, electrons leave a heated cathode and are accelerated by a high voltage applied between the cathode and the anode. As they are being pushed forward, a small input RF signal is applied to the input cavity, either accelerating or decelerating the electrons according to their time of arrival. After a long drift, late-emitted accelerated electrons catch up with early-emitted decelerated electrons, intersecting with those that did not see any net accelerating force. This is called velocity bunching.
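Velocity bunching is easy to see numerically. The sketch below uses assumed gun and drift parameters, not those of any real tube: electrons are launched over two RF periods with a small sinusoidal velocity modulation, drift, and pile up into bunches at the output.

```python
import numpy as np

# Fast late electrons catch up with slow early ones during the drift,
# so the current piles up into bunches at the output cavity.
c  = 3.0e8
v0 = 0.5 * c              # mean beam velocity (assumed gun voltage)
dv = 0.01 * v0            # modulation depth from the input cavity (assumed)
f  = 3.2e9                # drive frequency, Hz (the Varians' S-band)
L  = 0.75                 # drift length to the output cavity, m (assumed)

t_emit   = np.linspace(0.0, 2.0 / f, 20_000)   # two RF periods of emission
v        = v0 + dv * np.sin(2.0 * np.pi * f * t_emit)
t_arrive = t_emit + L / v

# Bunching appears as sharp peaks in the arrival-time histogram,
# folded onto a single RF period
hist, _ = np.histogram(t_arrive % (1.0 / f), bins=50)
print(f"peak/mean beam density at the output: {hist.max() / hist.mean():.1f}")
```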

A second, passive accelerating cavity is placed at the location where maximum bunching occurs. Though of a comparable design, this cavity behaves in an inverse fashion to those used in particle accelerators. Rather than converting the energy of an electromagnetic field into the kinetic energy of particles, the kinetic energy of particles is converted into RF electromagnetic waves. This process can be enhanced by the presence of other passive cavities in between the already mentioned two, as well as by several iterations of bunching and de-bunching before reaching the output cavity. Once decelerated, the spent beam finishes its life in a dump or a water-cooled collector.

Optimising efficiency

Klystrons are ultimately RF amplifiers with a very high gain, of the order of 30 to 60 dB, and a very narrow bandwidth. They can be built at any frequency from a few hundred MHz to tens of GHz, but each operates within a very small range of frequencies. After broadcasting became reliant on wider-bandwidth vacuum tubes, particle accelerators turned into a small market for high-power klystrons. Most klystrons for science are manufactured by a handful of companies which offer a limited number of models that have been in operation for decades. Their frequency, power and duty cycle may not correspond to the specifications of a new accelerator being considered – and in most cases, little or no thought has been given to energy efficiency or carbon footprint.
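Gain compares output with input power on a logarithmic scale,

\[
G\,[\mathrm{dB}] = 10\log_{10}\frac{P_{\mathrm{out}}}{P_{\mathrm{in}}},
\]

so 30 dB corresponds to a factor of 1000 in power and 60 dB to a factor of a million.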

Battling space charge

When searching for suitable solutions for the next particle-physics collider, however, optimising the energy efficiency of klystrons and other devices that will determine the final energy bill and CO2 emissions is a task of the utmost importance. Nearly a decade ago, therefore, RF experts at CERN and Lancaster University began the High-Efficiency Klystron (HEK) project to maximise beam-to-RF efficiency: the fraction of the power contained in the klystron’s electron beam that is converted into RF power by the output cavity.

The complexity of klystrons resides in the highly nonlinear fields to which the electrons are subjected. In the cathode and the first stages of electrostatic acceleration, the collective effect of “space-charge” forces between the electrons determines the strongly nonlinear dynamics of the beam. The same is true when the bunching tightens along the tube, with mutual repulsion between the electrons preventing optimal bunching at the output cavity.

For this reason, klystron design is not amenable to simple analytical calculation. Since 2017, CERN has developed a code called KlyC that simulates the beam along the klystron channel and optimises parameters such as frequency and distance between cavities 100 to 1000 times faster than commercial 3D codes. KlyC is available in the public domain and is being used by an ever-growing list of labs and industrial partners.

Perveance

The main characteristic of a klystron is an obscure quantity inherited from electron-gun design called perveance. For small perveances, space-charge forces are small, due to either high energy or low intensity, making bunching easy. For large perveances, space-charge forces oppose bunching, lowering beam-to-RF efficiency. High-power klystrons require large currents and therefore high perveances. One way to produce highly efficient, high-power klystrons is therefore for multiple cathodes to generate multiple low-perveance electron beams in a “multi-beam” (MB) klystron.
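Perveance is defined from the gun current I₀ and cathode voltage V₀ as

\[
K = \frac{I_0}{V_0^{3/2}},
\]

usually quoted in units of microperveance (10⁻⁶ A V⁻³/²). For a given beam power, raising the voltage and lowering the current reduces K, weakening the space-charge forces that oppose bunching.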

High-luminosity gains

Overall, there is an almost linear dependence between perveance and efficiency. Thanks to the efforts made in recent years, high-efficiency klystrons are now outperforming industrial klystrons by 10 percentage points in efficiency for all values of perveance, and approaching the ultimate theoretical limit (see “Battling space charge” figure).

One of the first designs to be brought to life was based on the E37113, a pulsed klystron with 6 MW peak power working in the X-band at 12 GHz, commercialised by CANON ETD. This klystron is currently used in the test facility at CERN for validating CLIC RF prototypes, which could greatly benefit from higher power. As part of a collaboration with CERN, CANON ETD built a new tube to the design optimised at CERN, reaching a beam-to-RF efficiency of 57% instead of the original 42% (see “CLIC klystron” image and CERN Courier September/October 2022 p9).

As its interfaces with the high-voltage (HV) source and solenoid were kept identical, one can now benefit from 8 MW of RF power for the same energy consumption as before. And as the changes to the tube channel represent only a small fraction of the manufacture of the instrument, its price should not increase considerably, even if more accurate production methods are required.
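The arithmetic follows directly from the definition of beam-to-RF efficiency:

\[
P_{\mathrm{RF}} = \eta\, P_{\mathrm{beam}}, \qquad P_{\mathrm{beam}} = \frac{6\ \mathrm{MW}}{0.42} \approx 14.3\ \mathrm{MW}, \qquad 0.57 \times 14.3\ \mathrm{MW} \approx 8\ \mathrm{MW}.
\]

The same electron beam, bunched more intelligently, simply wastes less of its power in the collector.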

In pursuit of power

Towards an FCC klystron

Another successful example of re-designing a tube for high efficiency is the TH2167 – the klystron behind the LHC, which is manufactured by Thales. Originally exhibiting a beam-to-RF efficiency of 60%, it was re-designed by the CERN team to gain 10 percentage points and reach 70% efficiency, while again using the same HV source and solenoid. The tube prototype has been built and is currently at CERN, where it has demonstrated the capacity to generate 350 kW of RF power with the same input energy as previously required to produce 300 kW. This power will be decisive when dealing with the higher-intensity beam expected after the LHC luminosity upgrade. And all this again for a price comparable to previous models (see “High-luminosity gains” image).

The quest for the highest efficiency is not over yet. The CERN team is currently working on a design that could power the proposed Future Circular Collider (FCC). Through about a hundred accelerating cavities, the electron and positron beams will need to be replenished with 100 MW of RF power, and energy efficiency is imperative.

Although the same tube in use for the LHC, now boosted to 70% efficiency, could be used to power the FCC, CERN is working towards a vacuum tube that could reach an efficiency over 80%. A two-stage multi-beam klystron was initially designed that was capable of reaching 86% efficiency and generating 1 MW of continuous-wave power (see “Towards an FCC klystron” figure).

Motivated by recent changes in FCC parameters, we have rediscovered an old device called a tristron, which is not a conventional klystron but a “gridded tube” in which the electron-beam bunching mechanism is different. Tristrons have a lower power gain but much greater flexibility. Simulations have confirmed that they can reach efficiencies as high as 90%. This could be a disruptive technology with applications well beyond accelerators. Manufacturing a prototype is an excellent opportunity for knowledge transfer from fundamental research to industrial applications.

CERN and ESA: a decade of innovation

Enrico Chesta, Véronique Ferlet-Cavrois and Markus Brugger highlight seven ways CERN and ESA are working together to further fundamental exploration and innovation in space technologies.

Sky maps

Particle accelerators and spacecraft both operate in harsh radiation environments, extreme temperatures and high vacuum. Each must process large amounts of data quickly and autonomously. Much can be gained from cooperation between scientists and engineers in each field.

Ten years ago, the European Space Agency (ESA) and CERN signed a bilateral cooperation agreement to share expertise and facilities. The goal was to expand the limits of human knowledge and keep Europe at the leading edge of progress, innovation and growth. A decade on, CERN and ESA have collaborated on projects ranging from cosmology and planetary exploration to Earth observation and human spaceflight, supporting new space-tech ventures and developing electronic systems, radiation-monitoring instruments and irradiation facilities.

1. Mapping the universe

The Euclid space telescope is exploring the dark universe by mapping the large-scale structure of billions of galaxies out to 10 billion light-years across more than a third of the sky. With tens of petabytes expected in its final data set – already a substantial reduction of the 850 billion bits of compressed images Euclid processes each day – it will generate more data than any other ESA mission by far.

With many CERN cosmologists involved in testing theories of beyond-the-Standard-Model physics, Euclid first became a CERN-recognised experiment in 2015. CERN also contributes to the development of Euclid’s “science ground segment” (SGS), which processes raw data received from the Euclid spacecraft into usable scientific products such as galaxy catalogues and dark-matter maps. CERN’s virtual-machine file system (CernVM-FS) has been integrated into the SGS to allow continuous software deployment across Euclid’s nine data centres and on developers’ laptops.

The telescope was launched in July 2023 and began observations in February 2024. The first piece of its great map of the universe was released in October 2024, showing millions of stars and galaxies and covering 132 square degrees of the southern sky (see “Sky map” figure). Based on just two weeks of observations, it accounts for just 1% of the project’s six-year survey, which will be the largest cosmic map ever made.

Future CERN–ESA collaborations on cosmology, astrophysics and multimessenger astronomy are likely to include the Laser Interferometer Space Antenna (LISA) and the NewAthena X-ray observatory. LISA will be the first space-based observatory to study gravitational waves. NewAthena will study the most energetic phenomena in the universe. Both projects are expected to be ready to launch about 10 years from now.

2. Planetary exploration

Though planetary exploration is conceptually far from fundamental physics, its technical demands require similar expertise. A good example is the Jupiter Icy Moons Explorer (JUICE) mission, which will make detailed observations of the gas giant and its three large ocean-bearing moons Ganymede, Callisto and Europa.

Jupiter’s magnetosphere is a million times greater in volume than Earth’s, trapping large fluxes of highly energetic electrons and protons. Before JUICE, the direct and indirect impact of high-energy electrons on modern electronic devices, and in particular their ability to cause “single event effects”, had never been studied. Two test campaigns took place in the VESPER facility, which is part of the CERN Linear Electron Accelerator for Research (CLEAR) project. Components were tested with tuneable beam energies between 60 and 200 MeV, and average fluxes of roughly 10⁸ electrons per square centimetre per second, mirroring expected radiation levels in the Jovian system.

JUICE radiation-monitor measurements

JUICE was successfully launched in April 2023, starting an epic eight-year journey to Jupiter including several flyby manoeuvres that will be used to commission the onboard instruments (see “Flyby” figure). JUICE should reach Jupiter in July 2031. It remains to be seen whether test results obtained at CERN have successfully de-risked the mission.

Another interesting example of cooperation on planetary exploration is the Mars Sample Return mission, which must operate in low temperatures during eclipse phases. CERN supported the main industrial partner, Thales Alenia Space, in qualifying the orbiter’s thermal-protection systems in cryogenic conditions.

3. Earth observation

Earth observation from orbit has applications ranging from environmental monitoring to weather forecasting. CERN and ESA collaborate both on developing the advanced technologies required by these applications and ensuring they can operate in the harsh radiation environment of space.

In 2017 and 2018, ESA teams came to CERN’s North Area with several partner companies to test the performance of radiation monitors, field-programmable gate arrays (FPGAs) and electronics chips in ultra-high-energy ion beams at the Super Proton Synchrotron. The tests mimicked the ultra-high-energy part of the galactic cosmic-ray spectrum, whose effects had never previously been measured on the ground beyond 10 GeV/nucleon. In 2017, ESA’s standard radiation-environment monitor and several FPGAs and multiprocessor chips were tested with xenon ions. In 2018, the highlight of the campaign was the testing of Intel’s Myriad-2 artificial intelligence (AI) chip with lead ions (see “Space AI” figure). Following its radiation characterisation and qualification, in 2020 the chip embarked on the φ-sat-1 mission to autonomously detect clouds using images from a hyperspectral camera.

Myriad 2 chip testing

More recently, CERN joined Edge SpAIce – an EU project that will monitor ecosystems and track plastic pollution in the oceans from onboard the Balkan-1 satellite. The project will use CERN’s high-level synthesis for machine learning (hls4ml) AI technology to run inference models on an FPGA that will be launched in 2025.
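For readers curious what that flow looks like in practice, the sketch below shows the general shape of an hls4ml conversion. The network architecture, layer sizes and output directory are illustrative assumptions, not details of the Edge SpAIce model.

```python
import numpy as np
import tensorflow as tf
import hls4ml

# Toy stand-in for an onboard classifier (architecture and sizes assumed)
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(8,)),
    tf.keras.layers.Dense(16, activation='relu'),
    tf.keras.layers.Dense(2, activation='softmax'),
])

# Translate the trained network into an FPGA firmware project
config = hls4ml.utils.config_from_keras_model(model, granularity='model')
hls_model = hls4ml.converters.convert_from_keras_model(
    model, hls_config=config, output_dir='hls_project')

hls_model.compile()                        # bit-accurate C simulation on the host
x = np.random.rand(4, 8).astype('float32')
print(hls_model.predict(x))                # should closely track model.predict(x)
# hls_model.build()                        # full synthesis; requires vendor tools
```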

Looking further ahead, ESA’s φ-lab and CERN’s Quantum Technology Initiative are sponsoring two PhD programmes to study the potential of quantum machine learning, generative models and time-series processing to advance Earth observation. Applications may accelerate the task of extracting features from images to monitor natural disasters, deforestation and the impact of environmental effects on the lifecycle of crops.

4. Dosimetry for human spaceflight

In space, nothing is more important than astronauts’ safety and wellbeing. To this end, in August 2021 ESA astronaut Thomas Pesquet activated the LUMINA experiment inside the International Space Station (ISS), as part of the ALPHA mission (see “Space dosimetry” figure). Developed under the coordination of the French Space Agency and the Laboratoire Hubert Curien at the Université Jean-Monnet-Saint-Étienne and iXblue, LUMINA uses two several-kilometre-long phosphorus-doped optical fibres as active dosimeters to measure ionising radiation aboard the ISS.

ESA astronaut Thomas Pesquet

When exposed to radiation, optical fibres experience a partial loss of transmitted power. Using a reference control channel, the radiation-induced attenuation can be accurately measured and related to the total ionising dose, with the sensitivity of the device primarily governed by the length of the fibre. Having studied optical-fibre-based technologies for many years, CERN helped optimise the architecture of the dosimeters and performed irradiation tests to calibrate the instrument, which will operate on the ISS for a period of up to five years.
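The measurement principle fits in one expression: comparing the power transmitted through the sensing fibre with the reference channel gives the radiation-induced attenuation

\[
A(t)\,[\mathrm{dB}] = 10\log_{10}\frac{P_{\mathrm{ref}}}{P(t)},
\]

which is mapped to total ionising dose using calibration curves of the kind measured at CERN. Since the attenuation accumulates along the fibre, several kilometres of fibre give a proportionally larger signal per unit dose than a short sample would.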

LUMINA complements dosimetry measurements performed on the ISS using CERN’s Timepix technology – an offshoot of the hybrid-pixel-detector technology developed for the LHC experiments (CERN Courier September/October 2024 p37). Timepix dosimeters have been integrated in multiple NASA payloads since 2012.

5. Radiation-hardness assurance

It’s no mean feat to ensure that CERN’s accelerator infrastructure functions in increasingly challenging radiation environments. Similar challenges are found in space. Damage can be caused by accumulating ionising doses, single-event effects (SEEs) or so-called displacement damage dose, which dislodges atoms within a material’s crystal lattice rather than ionising them. Radiation-hardness assurance (RHA) reduces radiation-induced failures in space through environment simulations, part selection and testing, radiation-tolerant design, worst-case analysis and shielding definition.

Since its creation in 2008, CERN’s Radiation to Electronics project has amplified the work of many equipment and service groups in modelling, mitigating and testing the effect of radiation on electronics. A decade later, joint test campaigns with ESA demonstrated the value of CERN’s facilities and expertise to RHA for spaceflight. This led to the signing of a joint protocol on radiation environments, technologies and facilities in 2019, which also included radiation detectors and radiation-tolerant systems, and components and simulation tools.

CHARM facility

Among CERN’s facilities is CHARM: the CERN high-energy-accelerator mixed-field facility, which offers an innovative approach to low-cost RHA. CHARM’s radiation field is generated by the interaction between a 24 GeV/c beam from the Proton Synchrotron and a metallic target. CHARM offers a uniquely wide spectrum of radiation types and energies, the possibility to adjust the environment using mobile shielding, and enough space to test a medium-sized satellite in full operating conditions.

Radiation testing is particularly challenging for the new generation of rapidly developed and often privately funded “new space” projects, which frequently make use of commercial and off-the-shelf (COTS) components. Here, RHA relies on testing and mitigation rather than radiation hardening by design. For “flip chip” configurations, which have their active circuitry facing inward toward the substrate, and dense three-dimensional structures that cannot be directly exposed without compromising their performance, heavy-ion beams accelerated to between 10 and 100 MeV/nucleon are the only way to induce SEE in the sensitive semiconductor volumes of the devices.

To enable testing of highly integrated electronic components, ESA supported studies to develop the CHARM heavy ions for micro-electronics reliability-assurance facility – CHIMERA for short (see “CHIMERA” figure). ESA has sponsored key feasibility activities such as: tuning the ion flux in a large dynamic range; tuning the beam size for board-level testing; and reducing beam energy to maximise the frequency of SEE while maintaining a penetration depth of a few millimetres in silicon.

6. In-orbit demonstrators

Weighing 1 kg and measuring just 10 cm on each side – a nanosatellite standard – the CELESTA satellite was designed to study the effects of cosmic radiation on electronics (see “CubeSat” figure). Initiated in partnership with the University of Montpellier and ESA, and launched in July 2022, CELESTA was CERN’s first in-orbit technology demonstrator.

Radiation-testing model of the CELESTA satellite

As well as providing the first opportunity for CHARM to test a full satellite, CELESTA offered the opportunity to flight-qualify SpaceRadMon, which counts single-event upsets (SEUs) and single-event latchups (SELs) in static random-access memory while using a field-effect transistor for dose monitoring. (SEUs are temporary errors caused by a high-energy particle flipping a bit and SELs are short circuits induced by high-energy particles.) More than 30 students contributed to the mission development, partially in the frame of ESA’s Fly Your Satellite Programme. Built from COTS components calibrated in CHARM, SpaceRadMon has since been adopted by other ESA missions such as Trisat and GENA-OT, and could be used in the future as a low-cost predictive maintenance tool to reduce space debris and improve space sustainability.
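The counting principle behind SpaceRadMon lends itself to a back-of-envelope check: the expected upset rate is the product of particle flux, per-bit cross-section and memory size. All numbers in the sketch below are illustrative assumptions, not mission values.

```python
# Rough SEU-rate estimate for an SRAM-based monitor such as SpaceRadMon.
flux      = 1e3        # particles / cm^2 / s (inner-belt order of magnitude)
sigma_bit = 1e-14      # per-bit upset cross-section, cm^2 (assumed)
n_bits    = 8 * 2**20  # one 8-Mbit SRAM device

upsets_per_day = flux * sigma_bit * n_bits * 86_400
print(f"expected upsets per day: {upsets_per_day:.1f}")   # ~7 per day
```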

The maiden flight of the Vega-C launcher placed CELESTA on an atypical quasi-circular medium-Earth orbit in the middle of the inner Van Allen proton belt at roughly 6000 km. Two months of flight data sufficed to validate the performance of the payload and the ground-testing procedure in CHARM, though CELESTA will fly for thousands of years in a region of space where debris is not a problem due to the harsh radiation environment.

The CELESTA approach has since been adopted by industrial partners to develop radiation-tolerant cameras, radios and on-board computers.

7. Stimulating the space economy

Space technology is a fast-growing industry replete with opportunities for public–private cooperation. The global space economy will be worth $1.8 trillion by 2035, according to the World Economic Forum – up from $630 billion in 2023 and growing at double the projected rate for global GDP.

Whether spun off from space exploration or particle physics, ESA and CERN look to support start-up companies and high-tech ventures in bringing to market technologies with positive societal and economic impacts (see “Spin offs” figure). The use of CERN’s Timepix technology in space missions is a prime example. Private company Advacam collaborated with the Czech Technical University to provide a Timepix-based radiation-monitoring payload called SATRAM to ESA’s Proba-V mission to map land cover and vegetation growth across the entire planet every two days.

The Hannover Messe fair

Advacam is now testing a pixel-detector instrument on JoeySat – an ESA-sponsored technology demonstrator for OneWeb’s next-generation constellation of satellites designed to expand global connectivity. Advacam is also working with ESA on radiation monitors for Space Rider and NASA’s Lunar Gateway. Space Rider is a reusable spacecraft whose maiden voyage is scheduled for the coming years, and Lunar Gateway is a planned space station in lunar orbit that could act as a staging post for Mars exploration.

Another promising example is SigmaLabs – a Polish startup founded by CERN alumni specialising in radiation detectors and predictive-maintenance R&D for space applications. SigmaLabs was recently selected by ESA and the Polish Space Agency to provide one of the experiments expected to fly on Axiom Mission 4 – a private spaceflight to the ISS in 2025 that will include Polish astronaut and CERN engineer Sławosz Uznański (CERN Courier May/June 2024 p55). The experiment will assess the scalability and versatility of the SpaceRadMon radiation-monitoring technology initially developed at CERN for the LHC and flight tested on the CELESTA CubeSat.

In radiation-hardness assurance, the CHIMERA facility is associated with the High-Energy Accelerators for Radiation Testing and Shielding (HEARTS) programme sponsored by the European Commission. Its 2024 pilot user run is already stimulating private innovation, with high-energy heavy ions used to perform business-critical research on electronic components for a dozen aerospace companies.

A word with CERN’s next Director-General

Mark Thomson, CERN’s Director-General designate for 2025, talks to the Courier about the future of particle physics.

Mark Thomson

What motivates you to be CERN’s next Director-General?

CERN is an incredibly important organisation. I believe my deep passion for particle physics, coupled with the experience I have accumulated in recent years, including leading the Deep Underground Neutrino Experiment, DUNE, through a formative phase, and running the Science and Technology Facilities Council in the UK, has equipped me with the right skill set to lead CERN through a particularly important period.

How would you describe your management style?

That’s a good question. My overarching approach is built around delegating and trusting my team. This has two advantages. First, it builds an empowering culture, which in my experience provides the right environment for people to thrive. Second, it frees me up to focus on strategic planning and engagement with numerous key stakeholders. I like to focus on transparency and openness, to build trust both internally and externally.

How will you spend your familiarisation year before you take over in 2026?

First, by getting a deep understanding of CERN “from within”, to plan how I want to approach my mandate. Second, by lending my voice to the scientific discussion that will underpin the third update to the European strategy for particle physics. The European strategy process is a key opportunity for the particle-physics community to provide genuine bottom-up input and shape the future. This is going to be a really varied and exciting year.

What open question in fundamental physics would you most like to see answered in your lifetime?

I am going to have to pick two. I would really like to understand the nature of dark matter. There are a wide range of possibilities, and we are addressing this question from multiple angles; the search for dark matter is an area where the collider and non-collider experiments can both contribute enormously. The second question is the nature of the Higgs field. The Higgs boson is just so different from anything else we’ve ever seen. It’s not just unique – it’s unique and very strange. There are just so many deep questions, such as whether it is fundamental or composite. I am confident that we will make progress in the coming years. I believe the High-Luminosity LHC will be able to make meaningful measurements of the self-coupling at the heart of the Higgs potential. If you’d asked me five years ago whether this was possible, I would have been doubtful. But today I am very optimistic because of the rapid progress with advanced analysis techniques being developed by the brilliant scientists on the LHC experiments.

What areas of R&D are most in need of innovation to meet our science goals?

Artificial intelligence is changing how we look at data in all areas of science. Particle physics is the ideal testing ground for artificial intelligence, because our data is complex and there are none of the issues around the sensitive nature of the data that exist in other fields. Complex multidimensional datasets are where you’ll benefit the most from artificial intelligence. I’m also excited by the emergence of new quantum technologies, which will open up fresh opportunities for our detector systems and also new ways of doing experiments in fundamental physics. We’ve only scratched the surface of what can be achieved with entangled quantum systems.

How about in accelerator R&D?

There are two areas that I would like to highlight: making our current technologies more sustainable, and the development of high-field magnets based on high-temperature superconductivity. This connects to the question of innovation more broadly. To quote one example among many, high-temperature superconducting magnets are likely to be an important component of fusion reactors just as much as particle accelerators, making this a very exciting area where CERN can deploy its engineering expertise and really push that programme forward. That’s not just a benefit for particle physics, but a benefit for wider society.

How has CERN changed since you were a fellow back in 1994?

The biggest change is that the collider experiments are larger and more complex, and the scientific and technical skills required have become more specialised. When I first came to CERN, I worked on the OPAL experiment at LEP – a collaboration of less than 400 people. Everybody knew everybody, and it was relatively easy to understand the science of the whole experiment.

But I don’t think the scientific culture of CERN and the particle-physics community has changed much. When I visit CERN and meet with the younger scientists, I see the same levels of excitement and enthusiasm. People are driven by the wonderful mission of discovery. When planning the future, we need to ensure that early-career researchers can see a clear way forward with opportunities in all periods of their career. This is essential for the long-term health of particle physics. Today we have an amazing machine that’s running beautifully: the LHC. I also don’t think it is possible to overstate the excitement of the High-Luminosity LHC. So there’s a clear and exciting future out to the early 2040s for today’s early-career researchers. The question is what happens beyond that? This is one reason to ensure that there is not a large gap between the end of the High-Luminosity LHC and the start of whatever comes next.

Should the world be aligning on a single project?

Given the increasing scale of investment, we do have to focus as a global community, but that doesn’t necessarily mean a single project. We saw something similar about 10 years ago when the global neutrino community decided to focus its efforts on two complementary long-baseline projects, DUNE and Hyper-Kamiokande. From the perspective of today’s European strategy, the Future Circular Collider (FCC) is an extremely appealing project that would map out an exciting future for CERN for many decades. I think we’ll see this come through strongly in an open and science-driven European strategy process.

How do you see the scientific case for the FCC?

For me, there are two key points. First, gaining a deep understanding of the Higgs boson is the natural next step in our field. We have discovered something truly unique, and we should now explore its properties to gain deeper insights into fundamental physics. Scientifically, the FCC provides everything you want from a Higgs factory, both in terms of luminosity and the opportunity to support multiple experiments.

Second, investment in the FCC tunnel will provide a route to hadron–hadron collisions at the 100 TeV scale. I find it difficult to foresee a future where we will not want this capability.

These two aspects make the FCC a very attractive proposition.

How successful do you believe particle physics is in communicating science and societal impacts to the public and to policymakers?

I think we communicate science well. After all, we’ve got a great story. People get the idea that we work to understand the universe at its most basic level. It’s a simple and profound message.

Going beyond the science, the way we communicate the wider industrial and societal impact is probably equally important. Here we also have a good story. In our experiments we are always pushing beyond the limits of current technology, doing things that have not been done before. The technologies we develop to do this almost always find their way back into something that will have wider applications. Of course, when we start, we don’t know what the impact will be. That’s the strength and beauty of pushing the boundaries of technology for science.

Would the FCC give a strong return on investment to the member states?

Absolutely. Part of the return is the science, part is the investment in technology, and we should not underestimate the importance of the training opportunities for young people across Europe. CERN provides such an amazing and inspiring environment for young people. The scale of the FCC will provide a huge number of opportunities for young scientists and engineers.

In terms of technology development, the detectors for the electron–positron collider will provide an opportunity for pushing forward and deploying new, advanced technologies to deliver the precision required for the science programme. In parallel, the development of the magnet technologies for the future hadron collider will be really exciting, particularly the potential use of high-temperature superconductors, as I said before.

It is always difficult to predict the specific “return on investment” on the technologies for big scientific research infrastructure. Part of this challenge is that some of the benefits might be 20, 30, 40 years down the line. Nevertheless, every retrospective that has tried has demonstrated that you get a huge downstream benefit.

Do we reward technical innovation well enough in high-energy physics?

There needs to be a bit of a culture shift within our community. Engineering and technology innovation are critical to the future of science and critical to the prosperity of Europe. We should be striving to reward individuals working in these areas.

Should the field make it more flexible for physicists and engineers to work in industry and return to the field having worked there?

This is an important question. I actually think things are changing. The fluidity between academia and industry is increasing in both directions. For example, an early-career researcher in particle physics with a background in deep artificial-intelligence techniques is valued incredibly highly by industry. It also works the other way around, and I experienced this myself in my career when one of my post-doctoral researchers joined from an industry background after a PhD in particle physics. The software skills they picked up from industry were incredibly impactful.

I don’t think there is much we need to do to directly increase flexibility – it’s more about culture change, to recognise that fluidity between industry and academia is important and beneficial. Career trajectories are evolving across many sectors. People move around much more than they did in the past.

Does CERN have a future as a global laboratory?

CERN already is a global laboratory. The amazing range of nationalities working here is both inspiring and a huge benefit to CERN.

How can we open up opportunities in low- and middle-income countries?

I am really passionate about the importance of diversity in all its forms and this includes national and regional inclusivity. It is an agenda that I pursued in my last two positions. At the Deep Underground Neutrino Experiment, I was really keen to engage the scientific community from Latin America, and I believe this has been mutually beneficial. At STFC, we used physics as a way to provide opportunities for people across Africa to gain high-tech skills. Going beyond the training, one of the challenges is to ensure that people use these skills in their home nations. Otherwise, you’re not really helping low- and middle-income countries to develop.

What message would you like to leave with readers?

That we have really only just started the LHC programme. With more than a factor of 10 increase in data to come, coupled with new data tools and upgraded detectors, the High-Luminosity LHC represents a major opportunity for a new discovery. Its nature could be a complete surprise. That’s the whole point of exploring the unknown: you don’t know what’s out there. This alone is incredibly exciting, and it is just a part of CERN’s amazing future.

CLOUD explains Amazon aerosols

The CLOUD collaboration at CERN has revealed a new source of atmospheric aerosol particles that could help scientists to refine climate models.

In a paper published in the journal Nature, the CLOUD collaboration at CERN has revealed a new source of atmospheric aerosol particles that could help scientists to refine climate models.

Aerosols are microscopic particles suspended in the atmosphere that arise from both natural sources and human activities. They play an important role in Earth’s climate system because they seed clouds and influence their reflectivity and coverage. Most aerosols arise from the spontaneous condensation of molecules that are present in the atmosphere only in minute concentrations. However, the vapours responsible for their formation are not well understood, particularly in the remote upper troposphere.

The CLOUD (Cosmics Leaving Outdoor Droplets) experiment at CERN is designed to investigate the formation and growth of atmospheric aerosol particles in a controlled laboratory environment. CLOUD comprises a 26 m³ ultra-clean chamber and a suite of advanced instruments that continuously analyse its contents. The chamber contains a precisely selected mixture of gases under atmospheric conditions, into which beams of charged pions are fired from CERN’s Proton Synchrotron to mimic the influence of galactic cosmic rays.

“Large concentrations of aerosol particles have been observed high over the Amazon rainforest for the past 20 years, but their source has remained a puzzle until now,” says CLOUD spokesperson Jasper Kirkby. “Our latest study shows that the source is isoprene emitted by the rainforest and lofted in deep convective clouds to high altitudes, where it is oxidised to form highly condensable vapours. Isoprene represents a vast source of biogenic particles in both the present-day and pre-industrial atmospheres that is currently missing in atmospheric chemistry and climate models.”

Isoprene is a hydrocarbon containing five carbon atoms and eight hydrogen atoms. It is emitted by broad-leaved trees and other vegetation and is the most abundant non-methane hydrocarbon released into the atmosphere. Until now, isoprene’s ability to form new particles has been considered negligible.

Seeding clouds

The CLOUD results change this picture. By studying the reaction of hydroxyl radicals with isoprene at upper tropospheric temperatures of –30 °C and –50 °C, the collaboration discovered that isoprene oxidation products form copious particles at ambient isoprene concentrations. This new source of aerosol particles does not require any additional vapours. However, when minute concentrations of sulphuric acid or iodine oxoacids were introduced into the CLOUD chamber, a 100-fold increase in aerosol formation rate was observed. Although sulphuric acid derives mainly from anthropogenic sulphur dioxide emissions, the acid concentrations used in CLOUD can also arise from natural sources.

In addition, the team found that isoprene oxidation products drive rapid growth of particles to sizes at which they can seed clouds and influence the climate – a behaviour that persists in the presence of nitrogen oxides produced by lightning at upper-tropospheric concentrations. After continued growth and descent to lower altitudes, these particles may provide a globally important source for seeding shallow continental and marine clouds, which influence Earth’s radiative balance – the amount of incoming solar radiation compared to outgoing longwave radiation (see “Seeding clouds” figure).

“This new source of biogenic particles in the upper troposphere may impact estimates of Earth’s climate sensitivity, since it implies that more aerosol particles were produced in the pristine pre-industrial atmosphere than previously thought,” adds Kirkby. “However, until our findings have been evaluated in global climate models, it’s not possible to quantify the effect.”

The CLOUD findings are consistent with aircraft observations over the Amazon, as reported in an accompanying paper in the same issue of Nature. Together, the two papers provide a compelling picture of the importance of isoprene-driven aerosol formation and its relevance for the atmosphere.

Since it began operation in 2009, the CLOUD experiment has unearthed several mechanisms by which aerosol particles form and grow in different regions of Earth’s atmosphere. “In addition to helping climate researchers understand the critical role of aerosols in Earth’s climate, the new CLOUD result demonstrates the rich diversity of CERN’s scientific programme and the power of accelerator-based science to address societal challenges,” says CERN Director for Research and Computing, Joachim Mnich.

The new hackerpreneur

Hackathons can kick-start your career, says hacker and entrepreneur Jiannan Zhang.

The World Wide Web, AI and quantum computing – what do these technologies have in common? They all started out as “hacks”, says Jiannan Zhang, founder of the open-source community platform DoraHacks. “When the Web was invented at CERN, it demonstrated that in order to fundamentally change how people live and work, you have to think of new ways to use existing technology,” says Zhang. “Progress cannot be made if you always start from scratch. That’s what hackathons are for.”

Ten years ago, Zhang helped organise the first CERN Webfest, a hackathon that explores creative uses of technology for science and society. Webfest helped Zhang develop his coding skills and knowledge of physics by applying them beyond his own discipline. He also made long-lasting connections with teammates, who were from different academic backgrounds and all over the world. After participating in more hackathons, Zhang’s growing “hacker spirit” inspired him to start his own company. In 2024 Zhang returned to Webfest not as a participant, but as the CEO of DoraHacks.

Hackathons are social coding events often spanning multiple days. They are inclusive and open – no academic institution or corporate backing is required – making them accessible to a diverse range of talented individuals. Participants work in teams, pooling their skills to tackle technical problems through software, hardware or a business plan for a new product. Physicists, computer scientists, engineers and entrepreneurs all bring their strengths to the table. Young scientists can pursue work that may not fit within typical research structures, develop their skills, and build portfolios and professional networks.

“If you’re really passionate about something, you should be able to jump on a project and work on it,” says Zhang. “You shouldn’t need to be associated with a university or have a PhD to pursue it.”

For early-career researchers, hackathons offer more than just technical challenges. They provide an alternative entry point into research and industry, bridging the gap between academia and real-world applications. University-run hackathons often attract corporate sponsors, giving them the budget to rent out stadiums with hundreds, sometimes thousands, of attendees.

“These large-scale hackathons really capture the attention of headhunters and mentors from industry,” explains Zhang. “They see the events as a recruitment pool. It can be a really effective way to advance careers and speak to representatives of big companies, as well as enhancing your coding skills.”

In the 2010s, weekend hackathons served as Zhang’s stepping stone into entrepreneurship. “I used to sit in the computer-science common room and work on my hacks. That’s how I met most of my friends,” recalls Zhang. “But later I realised that to build something great, I had to effectively organise people and capital. So I started to skip my computer-science classes and sneak into the business classrooms.” Zhang would hide in the back row of the business lectures, plotting his path towards entrepreneurship. He networked with peers to evaluate different business models each day. “It was fun to combine our knowledge of engineering and business theory,” he adds. “It made the journey a lot less stressful.”

But the transition from science to entrepreneurship was hard. “At the start you must learn and do everything yourself. The good thing is you’re exposed to lots of new skills and new people, but you also have to force yourself to do things you’re not usually good at.”

This is a dilemma many entrepreneurs face: whether to learn new skills from scratch, or to find business partners and delegate tasks. But finding trustworthy business partners is not always easy, and making the wrong decision can hinder the start up’s progress. That’s why planning the company’s vision and mission from the start is so important.

“The solution is actually pretty straightforward,” says Zhang. “You need to spend more time completing the important milestones yourself, to ensure you have a feasible product. Once you make the business plan and vision clear, you get support from everywhere.”

Decentralised community governance

Rather than hackathon participants competing for a week before abandoning their code, Zhang started DoraHacks to give teams from all over the world a chance to turn their ideas into fully developed products. “I want hackathons to be more than a recruitment tool,” he explains. “They should foster open-source development and decentralised community governance. Today, a hacker from Tanzania can collaborate virtually with a team in the US, and teams gain support to develop real products. This helps make tech fields much more diverse and accessible.”

Zhang’s company enables this by reducing logistical costs for organisers and providing funding mechanisms for participants, making hackathons accessible to aspiring researchers beyond academic institutions. As the community expands, new doors open for young scientists at the start of their careers.

“The business model is changing,” says Zhang. Hackathons are becoming fundamental to emerging technologies, particularly in areas like quantum computing, blockchain and AI, which often start out open source. “There will be a major shift in the process of product creation. Instead of building products in isolation, new technologies rely on platforms and infrastructure where hackers can contribute.”

Today, hackathons aren’t just about coding or networking – they’re about pushing the boundaries of what’s possible, creating meaningful solutions and launching new career paths. They act as incubators for ideas with lasting impact. Zhang wants to help these ideas become reality. “The future of innovation is collaborative and open source,” he says. “The old world relies on corporations building moats around closed-source technology, which is inefficient and inaccessible. The new world is centred around open platform technology, where people can build on top of old projects. This collaborative spirit is what makes the hacker movement so important.”

Unprecedented progress in energy-efficient RF https://cerncourier.com/a/unprecedented-progress-in-energy-efficient-rf/ Mon, 27 Jan 2025 07:14:38 +0000 https://cerncourier.com/?p=112349 Forty-five experts from industry and academia met in the magnificent city of Toledo for the second workshop on efficient RF sources.

Forty-five experts from industry and academia met in the magnificent city of Toledo, Spain from 23 to 25 September 2024 for the second workshop on efficient RF sources. Part of the I.FAST initiative on sustainable concepts and technologies (CERN Courier July/August 2024 p20), the event focused on recent advances in energy-efficient technology for RF sources essential to accelerators. Progress in the last two years has been unprecedented, with new initiatives and accomplishments around the world fuelled by the ambitious goals of new, high-energy particle-physics projects.

Out of more than 30 presentations, a significant number featured pulsed, high-peak-power RF sources working at frequencies above 3 GHz in the S, C and X bands. These involve high-efficiency klystrons that are being designed, built and tested for the KEK e–/e+ injector, the new EuPRAXIA@SPARC_LAB linac, the CLIC testing facilities, muon collider R&D, the CEPC injector linac and the C3 project. Reported increases in beam-to-RF power efficiency range from 15 percentage points for the retrofit prototype for CLIC to more than 25 points (expected) for a new greenfield klystron design that can be used across most new projects.

A very dynamic area for R&D is the search for efficient sources of the continuous-wave (CW) and long-pulse RF needed for circular accelerators. Typically working in the L-band, existing devices deliver less than 3 MW in peak power. Solid-state amplifiers, inductive output tubes, klystrons, magnetrons, triodes and exotic newly rediscovered vacuum tubes called “tristrons” compete in this arena. Successful prototypes have been built for the High-Luminosity LHC and CEPC with power-efficiency gains of 10 to 20 points. In the case of the LHC, this will allow 15% more power without an impact on the electricity bill; in the case of a circular Higgs factory, this will allow a 30% reduction. CERN and SLAC are also investigating very-high-efficiency vacuum tubes for the Future Circular Collider with a potential reduction of close to 50% on the final electricity bill. A collaboration between academia and industry would certainly be required to bring this exciting new technology to light.
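
The arithmetic behind these headline numbers is straightforward: wall-plug power is RF power divided by beam-to-RF efficiency. A minimal sketch, with efficiencies that are illustrative assumptions rather than workshop figures:

```python
# Back-of-the-envelope arithmetic for RF-source efficiency gains.
# The efficiency values are illustrative assumptions, not workshop figures.

def wall_plug_power(rf_power_mw: float, efficiency: float) -> float:
    """Wall-plug power (MW) needed to deliver a given RF power (MW)."""
    return rf_power_mw / efficiency

baseline, upgraded = 0.60, 0.70   # assumed beam-to-RF efficiencies

# Fixed electricity budget: how much more RF power does the upgrade buy?
print(f"extra RF power at a fixed bill: {upgraded / baseline - 1:.0%}")

# Fixed RF power: how much smaller does the electricity bill become?
print(f"bill reduction at fixed RF power: {1 - baseline / upgraded:.0%}")
```

A ten-point efficiency gain thus buys roughly 15% more RF power at a constant electricity bill, or a comparable saving at constant output.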

Besides the astounding advances in vacuum-tube technology, solid-state amplifiers based on cheap transistors are undergoing a major transformation thanks to the adoption of gallium-nitride technology. Commercial amplifiers are now capable of delivering kilowatts of power at low duty cycles with a power efficiency of 80%, while Uppsala University and the European Spallation Source have demonstrated the same efficiency for combined systems working in CW.

The search for energy efficiency does not stop at designing and building more efficient RF sources. All aspects of operation need to be folded in – power combination, permanent magnets and efficient modulators among them – as many concrete examples presented during the workshop made clear. The field is thriving.

ICFA talks strategy and sustainability in Prague https://cerncourier.com/a/icfa-talks-strategy-and-sustainability-in-prague-2/ Mon, 27 Jan 2025 07:13:18 +0000 https://preview-courier.web.cern.ch/?p=111309 The 96th ICFA meeting heard extensive reports from the leading HEP laboratories and various world regions on their recent activities and plans.

ICFA, the International Committee for Future Accelerators, was formed in 1976 to promote international collaboration in all phases of the construction and exploitation of very-high-energy accelerators. Its 96th meeting took place on 20 and 21 July during the recent ICHEP conference in Prague. Almost all of the 16 members from across the world attended in person, making the assembly lively and constructive.

The committee heard extensive reports from the leading HEP laboratories and various world regions on their recent activities and plans, including a presentation by Paris Sphicas, the chair of the European Committee for Future Accelerators (ECFA), on the process for the update of the European strategy for particle physics (ESPP). Launched by CERN Council in March 2024, the ESPP update is charged with recommending the next collider project at CERN after HL-LHC operation.

A global task

The ESPP update is also of high interest to non-European institutions and projects. Consequently, in addition to the expected inputs to the strategy from European HEP communities, those from non-European HEP communities are also welcome. Moreover, the recent US P5 report, the Chinese plans for CEPC – with a potential positive decision in 2025/2026 – and discussions about the ILC project in Japan will be important elements of the work to be carried out in the context of the ESPP update. They also emphasise the global nature of high-energy physics.

An integral part of the work of ICFA is carried out within its panels, which have been very active. Presentations were given by the new panel on the Data Lifecycle (chair Kati Lassila-Perini, Helsinki), the Beam Dynamics panel (new chair Yuan He, IMPCAS) and the Advanced and Novel Accelerators panel (new chair Patric Muggli, Max Planck Munich, proxied at the meeting by Brigitte Cros, Paris-Saclay). The Instrumentation and Innovation Development panel (chair Ian Shipsey, Oxford) is setting an example with its numerous schools, the ICFA instrumentation awards and centrally sponsored instrumentation studentships for early-career researchers from underserved world regions. Finally, the chair of the ILC International Development Team panel (Tatsuya Nakada, EPFL) summarised the latest status of the ILC Technological Network, and the proposed ILC collider project in Japan.

A special session was devoted to the sustainability of HEP accelerator infrastructures, considering the need to invest effort in guidelines that enable better comparison of the environmental reports of labs and infrastructures, in particular for future facilities. It was therefore natural for ICFA to hear reports not only from the panel on Sustainable Accelerators and Colliders led by Thomas Roser (BNL), but also from the European Lab Directors Working Group on Sustainability. This group, chaired by Caterina Bloise (INFN) and Maxim Titov (CEA), is mandated to develop a set of key indicators and a methodology for the reporting on future HEP projects, to be delivered in time for the ESPP update.

Finally, ICFA noted some very interesting structural developments in the global organisation of HEP. In the Asia-Oceania region, ACFA-HEP was recently formed as a sub-panel under the Asian Committee for Future Accelerators (ACFA), aiming for a better coordination of HEP activities in this particular region of the world. Hopefully, this will encourage other world regions to organise themselves in a similar way in order to strengthen their voice in the global HEP community – for example in Latin America. Here, a meeting was organised in August by the Latin American Association for High Energy, Cosmology and Astroparticle Physics (LAA-HECAP) to bring together scientists, institutions and funding agencies from across Latin America to coordinate actions for jointly funding research projects across the continent.

The next in-person ICFA meeting will be held during the Lepton–Photon conference in Madison, Wisconsin (USA), in August 2025.

AI treatments for stroke survivors https://cerncourier.com/a/ai-treatments-for-stroke-survivors/ Fri, 24 Jan 2025 15:52:08 +0000 https://cerncourier.com/?p=112345 Data on strokes is plentiful but fragmented, making it difficult to exploit in data-driven treatment strategies.

Data on strokes is plentiful but fragmented, making it difficult to exploit in data-driven treatment strategies. The toolbox of the high-energy physicist is well adapted to the task. To amplify CERN’s societal contributions through technological innovation, the UMBRELLA project – in full, Unleashing a Comprehensive, Holistic and Patient-Centric Stroke Management for a Better, Rapid, Advanced and Personalised Stroke Diagnosis, Treatment and Outcome Prediction – was officially launched on 1 October 2024, co-led by the Vall d’Hebron Research Institute and Siemens Healthineers. The kickoff meeting in Barcelona, Spain, convened more than 20 partners, including Philips, AstraZeneca, KU Leuven and EATRIS. Backed by nearly €27 million from the EU’s Innovative Health Initiative and industry collaborators, the project aims to transform stroke care across Europe.

The meeting highlighted the urgent need to address stroke as a pressing health challenge in Europe. Each year, more than one million acute stroke cases occur in Europe, with nearly 10 million survivors facing long-term consequences. In 2017, the economic burden of stroke treatments was estimated to be €60 billion – a figure that continues to grow. UMBRELLA’s partners outlined their collective ambition to translate a vast and fragmented stroke data set into actionable care innovations through standardisation and integration.

UMBRELLA will utilise advanced digital technologies to develop AI-powered predictive models for stroke management. By standardising real-world stroke data and leveraging tools like imaging technologies, wearable devices and virtual rehabilitation platforms, UMBRELLA aims to refine every stage of care – from diagnosis to recovery. Based on post-stroke data, AI-driven insights will empower clinicians to uncover root causes of strokes, improve treatment precision and predict patient outcomes, reshaping how stroke care is delivered.

Central to this effort is the integration of CERN’s federated-learning platform, CAFEIN. A decentralised approach to training machine-learning algorithms without exchanging data, it was initiated thanks to seed funding from CERN’s knowledge-transfer budget for the benefit of medical applications; CAFEIN now promises to enhance diagnosis, treatment and prevention strategies for stroke victims, ultimately saving countless lives. A main topic of the kickoff meeting was the development of the “U-platform” – a federated data ecosystem co-designed by Siemens Healthineers and CERN. Based on CAFEIN, the infrastructure will enable the secure and privacy-preserving training of advanced AI algorithms for personalised stroke diagnostics, risk prediction and treatment decisions without sharing sensitive patient data between institutions. Building on CERN’s expertise, including its success in federated AI modelling for brain pathologies under the EU TRUSTroke project, the CAFEIN team is poised to handle the increasing complexity and scale of data sets required by UMBRELLA.
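
The article does not show CAFEIN’s internals, but the idea behind federated learning is easy to sketch. In the toy example below (all names and numbers are invented for illustration), each “hospital” trains on data that never leaves the site, and only model weights are averaged centrally:

```python
# Minimal federated-averaging (FedAvg) sketch: a generic illustration of the
# idea behind platforms such as CAFEIN, not CERN's actual implementation.
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One site's private training step: plain logistic-regression SGD."""
    w = weights.copy()
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))      # local predictions
        w -= lr * X.T @ (p - y) / len(y)      # gradient step on local data only
    return w

# Three "hospitals" with private data that never leaves the site.
sites = [(rng.normal(size=(50, 4)), rng.integers(0, 2, 50)) for _ in range(3)]
global_w = np.zeros(4)

for _ in range(10):
    # Each site trains locally; only the updated weights are shared.
    local_ws = [local_update(global_w, X, y) for X, y in sites]
    global_w = np.mean(local_ws, axis=0)      # the server averages the weights

print("federated model weights:", global_w.round(3))
```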

Beyond technological advancements, the UMBRELLA consortium discussed a plan to establish standardised protocols for acute stroke management, with an emphasis on integrating these protocols into European healthcare guidelines. By improving data collection and facilitating outcome predictions, these standards will particularly benefit patients in remote and underserved regions. The project also aims to advance research into the causes of strokes, a quarter of which remain undetermined – a statistic UMBRELLA seeks to change.

This ambitious initiative not only showcases CERN’s role in pioneering federated-learning technologies but also underscores the broader societal benefits brought by basic science. By pushing technologies beyond the state of the art, CERN and other particle-physics laboratories have fuelled innovations that have an impact on our everyday lives. As UMBRELLA begins its journey, its success holds the potential to redefine stroke care, delivering life-saving advancements to millions and paving the way for a healthier, more equitable future.

Inside pyramids, underneath glaciers https://cerncourier.com/a/inside-pyramids-underneath-glaciers/ Wed, 20 Nov 2024 13:48:19 +0000 https://cern-courier.web.cern.ch/?p=111476 Coordinated by editors Paola Scampoli and Akitaka Ariga, Cosmic Ray Muography provides an invaluable snapshot of a booming research area.

Muon radiography – muography for short – uses cosmic-ray muons to probe and image large, dense objects. Coordinated by editors Paola Scampoli and Akitaka Ariga of the University of Bern, the authors of this book provide an invaluable snapshot of this booming research area. From muon detectors, which differ significantly from those used in fundamental physics research, to applications of muography in scientific, cultural, industrial and societal scenarios, a broad cross section of experts describe the physical principles that underpin modern muography.

Hiroyuki Tanaka of the University of Tokyo begins the book with historical developments and perspectives. He guides readers from the first documented use of cosmic-ray muons in 1955 for rock overburden estimation, to current studies of the sea-level dynamics in Tokyo Bay using muon detectors laid on the seafloor and visionary ideas to bring muography to other planets using teleguided rovers.

Scattering methods

Tanaka limits his discussion to the muon-absorption approach to muography, which images an object by comparing the muon flux before and after – or with and without – it. The muon-scattering approach, invented two decades ago, instead exploits the deflection of muons passing through matter due to electromagnetic interactions with nuclei. The interested reader will find several examples of the application of muon scattering in other chapters, particularly that on civil and industrial applications by Davide Pagano (Pavia) and Altea Lorenzon (Padova). Scattering methods have an edge in these fields thanks to their sensitivity to the atomic number of the materials under investigation.
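
That edge follows from multiple Coulomb scattering: the RMS deflection of a muon grows as the radiation length of the material shrinks, and the radiation length falls steeply with atomic number. A minimal sketch using the standard Highland (PDG) formula, textbook radiation lengths and an assumed 3 GeV/c muon:

```python
# Why scattering muography is sensitive to atomic number: the RMS multiple-
# Coulomb-scattering angle grows as the radiation length X0 shrinks, and X0
# drops steeply with Z. Highland/PDG formula; X0 values from standard tables.
import math

X0_CM = {"water": 36.08, "concrete": 11.55, "iron": 1.757, "lead": 0.5612}

def theta0_mrad(p_mev, thickness_cm, x0_cm, beta=1.0):
    """RMS projected scattering angle (mrad) for a singly charged particle."""
    t = thickness_cm / x0_cm
    return 13.6 / (beta * p_mev) * math.sqrt(t) * (1 + 0.038 * math.log(t)) * 1e3

for material, x0 in X0_CM.items():
    print(f"{material:9s}: {theta0_mrad(3000.0, 10.0, x0):5.1f} mrad over 10 cm")
```

Ten centimetres of lead scatters such a muon roughly ten times more strongly than the same depth of water – exactly the contrast that scattering muography images.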

Cosmic Ray Muography

Peter Grieder (Bern), who sadly passed away shortly before the publication of the book, gives an excellent and concise introduction to the physics of cosmic rays, which Paolo Checchia (Padova) expands on, delving into the physics of interactions between muons and matter. Akira Nishio (Nagoya University) describes the history and physical principles of nuclear emulsions. These detectors played an important role in the history of particle physics, but are not very popular now as they cannot provide real-time information. Though modern detectors are a more common choice today, nuclear emulsions still find a niche in muography thanks to their portability. The large accumulation of data from muography experiments requires automatic analysis, for which dedicated scanning systems have been developed. Nishio includes a long and insightful discussion on how the nuclear-emulsions community reacted to supply-chain evolution. The transition from analogue to digital cameras meant that most film-producing firms changed their core business or simply disappeared, and researchers had to take a large part of the production process into their own hands.

Fabio Ambrosino and Giulio Saracino of INFN Napoli next take on the task of providing an overview of the much broader and more popular category of real-time detectors, such as those commonly used in experiments at particle colliders. Elaborating on the requirements set by the cosmic rate and environmental factors, their chapter explains why scintillator and gas-based tracking devices are the most popular options in muography. They also touch on more exotic detector options, including Cherenkov telescopes and cylindrical tracking detectors that fit in boreholes.

In spite of their superficial similarity, methods that are common in X-ray imaging need quite a lot of ingenuity to be adapted to the context of muography. For example, the source cannot be controlled in muography, and is not mono­chromatic. Both energy and direction are random and have a very broad distribution, and one cannot afford to take data from more than a few viewpoints. Shogo Nagahara and Seigo Miyamoto of the University of Tokyo provide a specialised but intriguing insight into 3D image reconstruction using filtered back-projection.
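
For readers curious to see the principle in code, here is a deliberately textbook two-dimensional filtered back-projection – far simpler than the muography-specific treatment described in the chapter:

```python
# Minimal 2D filtered back-projection: ramp-filter each projection in Fourier
# space, then smear it back across the image plane and sum over angles.
import numpy as np

def fbp(sinogram, angles_deg):
    """Reconstruct an N x N image from a (num_angles, N) sinogram."""
    n_angles, n = sinogram.shape
    ramp = np.abs(np.fft.fftfreq(n))
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))

    xs = np.arange(n) - n / 2
    X, Y = np.meshgrid(xs, xs)
    image = np.zeros((n, n))
    for proj, theta in zip(filtered, np.deg2rad(angles_deg)):
        s = X * np.cos(theta) + Y * np.sin(theta)           # detector coordinate
        idx = np.clip(np.round(s + n / 2).astype(int), 0, n - 1)
        image += proj[idx]                                  # back-project one view
    return image * np.pi / n_angles

# Usage: a sinogram produced by e.g. skimage.transform.radon(phantom)
# reconstructs to an approximation of the original phantom.
```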

Geoscience is among the most mature applications of muography. While Jacques Marteau (Claude Bernard University Lyon 1) provides a broad overview of decades of activities spanning from volcano studies to the exploration of natural caves, Ryuichi Nishiyama (Tokyo) explores recent studies where muography provided unique data on the shape of the bedrock underneath two major glaciers in the Swiss Alps.

One of the greatest successes of muography is the study of pyramids, which is given ample space in the chapter on archaeology by Kunihiro Morishima (Nagoya). In 1971, Nobel laureate Luis Alvarez’s team pioneered the use of muography in archaeology during an investigation at the pyramid of Khafre in Giza, Egypt, motivated by his hunch that an unknown large chamber could be hiding in the pyramid. Their data convincingly excluded that possibility, but the attempt can be regarded as launching modern muography (CERN Courier May/June 2023 p32). Half a century later, muography was reintroduced to the exploration of Egyptian pyramids thanks to ScanPyramids – an international project led by particle-physics teams in France and Japan under the supervision of the Heritage Innovation and Preservation Institute. ScanPyramids aims at systematically surveying all of the main pyramids in the Giza complex, and recently made headlines by finding a previously unknown corridor-shaped cavity in Khufu’s Great Pyramid, which is the second largest pyramid in the world. To support the claim, which was initially based on muography alone, the finding was cross-checked with the more traditional surveying method based on ground-penetrating radar, and finally confirmed via visual inspection through an endoscope.

Pedagogical focus

This book is a precious resource for anyone approaching muography, from students to senior scientists, and potential practitioners from both academic and industrial communities. There are some other excellent books that have already been published on the same topic, and that have showcased original research, but Cosmic Ray Muography’s pedagogical focus, which prioritises the explanation of timeless first principles, will not become outdated any time soon. Given that each chapter was written independently, there is a certain degree of overlap and some inconsistency in terminology, but this gives the reader valuable exposure to different perspectives about what matters most in this type of research.

Threshold moment for medical photon counting https://cerncourier.com/a/threshold-moment-for-medical-photon-counting/ Mon, 16 Sep 2024 09:01:26 +0000 https://preview-courier.web.cern.ch/?p=111160 The seventh workshop on Medical Applications of Spectroscopic X-ray Detectors was held at CERN in April.

7th Workshop on Medical Applications of Spectroscopic X-ray Detectors participants

The seventh workshop on Medical Applications of Spectroscopic X-ray Detectors was held at CERN from 15 to 18 April. This year’s workshop brought together more than 100 experts in medical imaging, radiology, physics and engineering. The workshop focused on the latest advancements in spectroscopic X-ray detectors and their applications in medical diagnostics and treatment. Such detectors, whose origins are found in detector R&D for high-energy physics, are now experiencing a breakthrough moment in medical practice.

Spectroscopic X-ray detectors represent a significant advancement in medical imaging. Unlike traditional X-ray detectors that measure only the intensity of X-rays, these advanced detectors can differentiate the energies of X-ray photons. This enables enhanced tissue differentiation, improved tumour detection and advanced material characterisation, which may lead in certain cases to functional imaging without the need for radioactive tracers.
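
A toy model makes the distinction concrete. In the sketch below, with an invented spectrum and illustrative thresholds, an integrating detector collapses everything into a single number while a counting detector preserves coarse energy information:

```python
# Toy comparison of an energy-integrating readout with a photon-counting one.
# The spectrum and thresholds are invented purely for illustration.
import numpy as np

rng = np.random.default_rng(1)
photons_kev = rng.uniform(20, 120, size=10_000)   # toy transmitted spectrum

# Energy-integrating readout: one number per pixel, spectral information lost.
integrated = photons_kev.sum()

# Photon-counting readout: comparator thresholds sort photons into energy bins.
edges = [20, 60, 120]                             # keV
counts = [((photons_kev >= lo) & (photons_kev < hi)).sum()
          for lo, hi in zip(edges[:-1], edges[1:])]

print(f"integrated signal: {integrated:.0f} keV")
print(f"counts per bin {edges}: {counts}")        # bin ratios flag the material
```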

The technology has its roots in the 1980s and 1990s when the high-energy-physics community centred around CERN developed a combination of segmented silicon sensors and very large-scale integration (VLSI) readout circuits to enable precision measurements at unprecedented event rates, leading to the development of hybrid pixel detectors (see p37). In the context of the Medipix Collaborations, CERN has coordinated research on spectroscopic X-ray detectors including the development of photon-counting detectors and new semiconductor materials that offer higher sensitivity and energy resolution. By the late 1990s, several groups had proofs of concept, and by 2008, pre-clinical spectral photon-counting computed-tomography (CT) systems were under investigation.

In 2011, leading researchers in the field decided to bring together engineers, physicists and clinicians to help address the scientific, medical and engineering challenges associated with guiding the technology toward clinical adoption. In 2021, the FDA approval of Siemens Healthineers’ photon-counting CT scanner marked a significant milestone in the field of medical imaging, validating the clinical benefits of spectroscopic X-ray detectors. The mobile CT scanner OmniTom Elite from NeuroLogica, approved in March 2022, also integrates photon-counting detector (PCD) technology. The 3D colour X-ray scanner developed by MARS Bioimaging, in collaboration with CERN based on Medipix3 technology, has already shown significant promise in pre-clinical and clinical trials. Clinical trials of MARS scanners demonstrated its applications for detecting acute fractures, evaluating fracture healing and assessing osseous integration at the bone–metal interface for fracture fixations and joint replacements. With more than 300 million CT scans being performed annually around the world, the potential impact of spectroscopic X-ray imaging is enormous, but technical and medical challenges remain, and the need for this highly specialised workshop continues.

The scientific presentations in the 2024 workshop covered the integration of spectroscopic CT in clinical workflows, addressed technical challenges in photon-counting detector technology and explored new semiconductor materials for X-ray detectors. The technical sessions on detector physics and technology discussed new methodologies for manufacturing high-purity cadmium–zinc–telluride semiconductor crystals and techniques to enhance the quantum efficiency of current detectors. Sessions on clinical applications and imaging techniques included case studies demonstrating the benefits of multi-energy CT in cardiology and neurology, and advances in using spectroscopic detectors for enhanced contrast-agent differentiation. The sessions on computational methods and data processing covered the implementation of AI algorithms to improve image reconstruction and analysis, and efficient storage and retrieval systems for large-scale spectral imaging datasets. The sessions on regulatory and safety aspects focused on the regulatory pathway for new spectroscopic X-ray detectors, ensuring patient and operator safety with high-energy X-ray systems.

Enhancing patient outcomes

The field of spectroscopic X-ray detectors is rapidly evolving. Continued research, collaboration and innovation to enhance medical diagnostics and treatment outcomes will be essential. Spectroscopic X-ray detectors offer unparalleled diagnostic capabilities, enabling more detailed imaging and earlier and precise disease detection, which improves patient outcomes. To stay competitive and meet the demand for precision medicine, medical institutions are increasingly adopting advanced imaging technologies. Continued collaboration among researchers, physicists and industry leaders will drive innovation, benefiting patients, healthcare providers and research institutions.

Acceleration, but not as we know it https://cerncourier.com/a/acceleration-but-not-as-we-know-it/ Fri, 05 Jul 2024 09:34:22 +0000 https://preview-courier.web.cern.ch/?p=110800 On-chip acceleration pioneers Robert Byer, Joel England, Peter Hommelhoff and Roy Shiloh report on progress to miniaturise accelerators from centimetres to microns.

Metal cavities are at the heart of the vast majority of the world’s 30,000 or so particle accelerators. Excited by microwaves, these resonant structures are finely tuned to generate oscillating electric fields that accelerate particles over many metres. But what if similar energies could be delivered 100 times more rapidly in structures a few tens of microns wide or less?

The key is to reduce the wavelength of the radiation powering the structure down to the optical scale of lasers. By combining solid-state lasers and modern nanofabrication, accelerating structures can be as small as a single micron wide. Though miniaturisation will never allow bunch charges as large as in today’s science accelerators, field strengths can be much higher before structure damage sets in. The trick is to replace highly conductive structures with dielectrics like silicon, fused silica and diamond, which have a much higher damage threshold at optical wavelengths. The length of accelerators can thereby be reduced by orders of magnitude, with millions to billions of particle pulses accelerated per second, depending on the repetition rate of the laser.

Recent progress with “on chip” accelerators promises powerful, high-energy and high-repetition-rate particle sources that are accessible to academic laboratories. Applications may range from localised particle or X-ray irradiation in medical facilities to quantum communication and computation using ultrasmall bunches of electrons as qubits.

Laser focused

The inspiration for on-chip accelerators dates back to 1962, when Koichi Shimoda of the University of Tokyo proposed using early lasers – then called optical masers – as a way to accelerate charged particles. The first experiments were conducted by shining light onto an open metal grating, generating an optical surface mode that could accelerate electrons passing above the surface. This technique was proposed by Yasutugu Takeda and Isao Matsui in 1968 and experimentally demonstrated by Koichi Mizuno in 1987 using terahertz radiation. In the 1980s, accelerator physicist Robert Palmer of Brookhaven National Laboratory proposed using rows of free-standing pillars of subwavelength separation illuminated by a laser – an idea that has propagated to modern devices.

The longitudinal electric field in a dual-pillar colonnade illuminated by a laser

In the 1990s, the groups of John Rosenzweig and Claudio Pellegrini at UCLA and Robert Byer at Stanford began to use dielectric materials, which offer low power absorption at optical frequencies. For femtosecond laser pulses, a simple dielectric such as silica glass can withstand optical field strengths exceeding 10 GV/m. It became clear that combining lasers with on-chip fabrication using dielectric materials could subject particles to accelerating forces 10 to 100 times higher than in conventional accelerators.

In the intervening decades, the dream of realising a laser-driven micro-accelerator has been enabled by major technological advances in the silicon-microchip industry and solid-state lasers. These industrial technologies have paved the way to fabricate and test particle accelerators made from silicon and other dielectric materials driven by ultrashort pulses of laser light. The dielectric laser accelerator (DLA) has been born.

Accelerator on a chip

Colloquially called an accelerator on a chip, a DLA is a miniature microwave accelerator reinvented at the micron scale using the methods of optical photonics rather than microwave engineering. In both cases, the wavelength of the driving field determines the typical transverse structure dimensions: centimetres for today’s microwave accelerators, but between one and 10 μm for optically powered devices.

Other laser-based approaches to miniaturisation are available. In plasma-wakefield accelerators, particles gain energy from electromagnetic fields excited in an ionised gas by a high-power drive laser (CERN Courier May/June 2024 p25). But the details are starkly different. DLAs are powered by lasers with thousands to millions of times lower peak energy. They operate with more than a million times lower electron charges, but at millions of pulses per second. And unlike plasma accelerators, but similarly to their microwave counterparts, DLAs use a solid material structure with a vacuum channel in which an electromagnetic mode continuously imparts energy to the accelerated particles.

Dielectric structures

This mode can be created by a single laser pulse perpendicular to the electron trajectory, two pulses from opposite sides, or a single pulse directed downwards into the plane of the chip. The latter two options offer better field symmetry.

As the laser impinges on the structure, its electrons experience an electromagnetic force that oscillates at the laser frequency. Particles that are correctly matched in phase and velocity experience a forward accelerating force (see “Continuous acceleration” image). Just as the imparted force begins to change sign, the particles enter the next accelerating cycle, leading to continuous energy gain.
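
A short numerical sketch captures this synchronous energy gain. The gradient, drive wavelength and injection energy below are illustrative values in the range reported by the ACHIP groups, not the parameters of any particular device; the structure period is re-matched to the electron velocity at every step:

```python
# Sketch: energy gain of an electron riding a phase-matched accelerating wave.
# Gradient, wavelength and injection energy are illustrative assumptions.
import math

ME_KEV = 511.0                 # electron rest energy in keV

def beta(kinetic_kev):
    """Electron speed as a fraction of c, from its kinetic energy."""
    gamma = 1.0 + kinetic_kev / ME_KEV
    return math.sqrt(1.0 - 1.0 / gamma**2)

gradient_mev_per_m = 35.0      # assumed average accelerating gradient
wavelength_um = 2.0            # assumed drive-laser wavelength
ke_kev, z_um = 96.0, 0.0       # assumed injection energy, position

while z_um < 700.0:            # march through one structure period at a time
    period_um = beta(ke_kev) * wavelength_um               # phase matching
    ke_kev += gradient_mev_per_m * 1e3 * period_um * 1e-6  # gain per period
    z_um += period_um

print(f"final energy: {ke_kev:.1f} keV after {z_um:.0f} um")
```

With these assumptions the electron gains roughly 25% in energy over 700 µm, while the matched period grows by tens of nanometres – the scale of the demonstrations and the structure tapering described later in this article.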

In 2013, two early experiments attracted international attention by demonstrating the acceleration of electrons using structured dielectric devices. Peter Hommelhoff’s group in Germany accelerated 28 keV electrons inside a modified electron microscope using a single-sided glass grating (see “Evolution” image, left panel). In parallel, at SLAC, the groups of Robert Byer and Joel England accelerated relativistic 60 MeV electrons using a dual-sided grating structure, achieving an acceleration gradient of 310 MeV/m and 120 keV of energy gain (see “Evolution” image, middle panel).

Teaming up

Encouraged by the experimental demonstration of accelerating gradients of hundreds of MeV/m, and the power efficiency and compactness of modern solid-state fibre lasers, in 2015 the Gordon and Betty Moore Foundation funded an international collaboration of six universities, three government laboratories and two industry partners to form the Accelerator on a Chip International Program (ACHIP). The central goal is to demonstrate a compact tabletop accelerator based on DLA technology. ACHIP has since developed “shoebox” accelerators on both sides of the Atlantic and used them to demonstrate nanophotonics-based particle control, staging, bunching, focusing and full on-chip electron acceleration by laser-driven microchip devices.

Silicon’s compatibility with established nanofabrication processes makes it convenient, but reaching gradients of GeV/m requires materials with higher damage thresholds such as fused silica or diamond. In 2018, ACHIP research at UCLA accelerated electrons from a conventional microwave linac in a dual-sided fused-silica structure powered by ultrashort (45 fs) pulses of 800 nm laser light. The result was an average accelerating gradient of 850 MeV/m and accelerating fields up to 1.8 GV/m – more than double the prior world best in a DLA, and still a world record.

Longitudinal and transverse beam control

Since DLA structures are non-resonant, the interaction time and energy gain of the particles are limited by the duration of the laser pulse. However, by tilting the laser’s pulse front, the interaction time can be arbitrarily increased. In a separate experiment at UCLA, using a laser pulse front tilted by 45°, the interaction distance was increased to more than 700 µm – or 877 structure periods – with an energy gain of 0.315 MeV. The UCLA group has further extended this approach using a spatial light modulator to “imprint” the phase information onto the laser pulse, achieving more than 3 mm of interaction at 800 nm, or 3761 structure periods.

Under ACHIP, the structure design has evolved in several directions, from single-sided and double-sided gratings etched onto substrates to more recent designs with colonnades of free-standing silicon pillars forming the sides of the accelerating channel, as originally proposed by Robert Palmer some 30 years earlier. At present, these dual-pillar structures (see “Evolution” image, right panel) have proven to be the optimal trade-off between cleanroom fabrication complexity and experimental technicalities. However, due to the lower damage threshold of silicon as compared with fused silica, researchers have yet to demonstrate gradients above 350 MeV/m in silicon-based devices.

With the dual-pillar colonnade chosen as the fundamental nanophotonic building block, research has turned to making DLAs into viable accelerators with much longer acceleration lengths. To achieve this, we need to be able to control the beam and manipulate it in space and time, or electrons quickly diverge inside the narrow acceleration channel and are lost on impact with the accelerating structure. The ACHIP collaboration has made substantial progress here in recent years.

Focusing on nanophotonics

In conventional accelerators, quadrupole magnets focus electron beams in a near perfect analogy to how concave and convex lens arrays transport beams of light in optics. In laser-driven nanostructures it is necessary to harness the intrinsic focusing forces that are already present in the accelerating field itself.

In 2021, the Hommelhoff group guided an electron pulse through a 200 nm-wide and 80 µm-long structure based on a theoretical lattice designed by ACHIP colleagues at TU Darmstadt three years earlier. The lattice’s alternating-phase focusing (APF) periodically exchanges an electron bunch’s phase-space volume between the transverse dimension across the narrow width of the accelerating channel and the longitudinal dimension along the propagation direction of the electron pulse. In principle this technique could allow electrons to be guided through arbitrarily long structures.

Guiding is achieved by adding gaps between repeating sets of dual-pillar building-blocks (see “Beam control” image). Combined guiding and acceleration has been demonstrated within the past year. To achieve this, we select a design gradient and optimise the position of each pillar pair relative to the expected electron energy at that position in the structure. Initial electron energies are up to 30 keV in the Hommelhoff group, supplied by electron microscopes, and from 60 to 90 keV in the Byer group, using laser-assisted field emission from silicon nanotips. When accelerated, the electrons’ velocities change dramatically from 0.3 to 0.7 times the speed of light or higher, requiring the periodicity of the structure to change by tens of nanometres to match the velocity of the accelerating wave to the speed of the particles.

On-chip accelerator light source

Although focusing in the narrow dimension of the channel is the most critical requirement, an extension of this method has been proposed that focuses beams in the transverse vertical dimension, out of the plane of the chip, by varying the geometry of the pillars along the out-of-plane dimension. Without it, the natural divergence of the beam in the vertical direction eventually becomes dominant. This approach is awaiting experimental realisation.

Acceleration gradients can be improved by optimising material choice, pillar dimensions, peak optical field strength and the duration of the laser pulses. In recent demonstrations, both the Byer and Hommelhoff groups have kept pillar dimensions constant to ease difficulties in uniformly etching the structures during nanofabrication. The complete structure is then a series of APF cells with tapered cell lengths and tapered dual-pillar periodicity. The combination of tapers accommodates both the changing size of the electron beam and the phase matching required due to the increasing electron energy.

In these proof-of-principle experiments, the Hommelhoff group has designed a nanophotonic dielectric laser accelerator for an injection energy of 28.4 keV and an average acceleration gradient of at least 22.7 MeV/m, demonstrating a 43% energy increase over a 500 µm-long structure. The Byer group recently demonstrated the acceleration of a 96 keV beam at average gradients of 35 to 50 MeV/m, reaching a 25% energy increase over 708 µm. The APF periods were in the range of tens of microns and were tapered along with the energy-gain design curve. The beams were not bunched, and by design only 4% of the electrons were captured and accelerated.

One final experimental point has important implications for the future use of DLAs as compact tabletop tools for ultrafast science. Upon interaction with the DLA, electron pulses have been observed to form trains of evenly spaced sub-wavelength attosecond-scale bunches. This effect was shown experimentally by both groups in 2019, with electron bunches measured down to 270 attoseconds, or roughly 4% of the optical cycle.
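
Those numbers are self-consistent, as a two-line check shows:

```python
# 270 as is quoted as roughly 4% of an optical cycle; inverting that statement
# recovers the implied cycle length and drive-laser wavelength.
C = 299_792_458.0             # speed of light in m/s

bunch_s = 270e-18             # measured bunch duration
cycle_s = bunch_s / 0.04      # one optical cycle, if the bunch spans 4% of it
print(f"optical cycle: {cycle_s * 1e15:.2f} fs")               # ~6.8 fs
print(f"implied drive wavelength: {C * cycle_s * 1e6:.2f} um") # ~2 um
```

The implied drive wavelength of about 2 µm is consistent with the infrared lasers used in these experiments.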

From demonstration to application

To date, researchers have demonstrated high gradient (GeV/m) acceleration, compatible nanotip electron sources, laser-driven focusing, interaction lengths up to several millimetres, the staging of multiple structures, and attosecond-level control and manipulation of electrons in nanophotonic accelerators. The most recent experiments combine these techniques, allowing the capture of an accelerated electron bunch with net acceleration and precise control of electron dynamics for the first time.

These milestone experiments demonstrate the viability of the nanophotonic dielectric electron accelerator as a scalable technology that can be extended to arbitrarily long structures and ever higher energy gains. But for most applications, beam currents need to increase.

A compelling idea proposes to “copy and paste” the accelerator design in the cleanroom and make a series of parallel accelerating channels on one chip. Another option is to increase the repetition rate of the driving laser by orders of magnitude to produce more electron pulses per second. Optimising the electron sources used by DLAs would also allow for more electrons per pulse, and parallel arrays of emitters on multi-channel devices promise tremendous advantages. Eventually, active nanophotonics can be employed to integrate the laser and electron sources on a single chip.

Once laser and electron sources are combined, we expect on-chip accelerators to become ubiquitous devices with wide-ranging and unexpected applications, much like the laser itself. Future applications will range from medical treatment tools to electron probes for ultrafast science. According to International Atomic Energy Agency statistics, 13% of major accelerator facilities around the world power light sources. On-chip accelerators may follow a similar path.

Illuminating concepts

A concept has been proposed for a dielectric laser-driven undulator (DLU) which uses laser light to generate deflecting forces that wiggle the electrons so that they emit coherent light. Combining a DLA and a DLU could take advantage of the unique time structure of DLA electrons to produce ultrafast pulses of coherent radiation (see “Compact light source” image). Such compact new light sources – small enough to be accessible to individual universities – could generate extremely short flashes of light in ultraviolet or even X-ray wavelength ranges, enabling tabletop instruments for the study of material dynamics on ultrafast time scales. Pulse trains of attosecond electron bunches generated by a DLA could provide excellent probes of transient molecular electronic structure.

The generation of intriguing quantum states of light might also be possible with nanophotonic devices. This quantum light results from shaping electron wavepackets inside the accelerator and making them radiate, perhaps even leading to on-chip quantum-communication light sources.

In the realm of medicine, an ultracompact self-contained multi-MeV electron source based on integrated photonic particle accelerators could enable minimally invasive cancer treatments with improved dose control.

One day, instruments relying on high-energy electrons produced by DLA technology may bring the science of large facilities into academic-scale laboratories, making novel science endeavours accessible to researchers across various disciplines and minimally invasive medical treatments available to those in need. These visionary applications may take decades to be fully realised, but we should expect developments to continue to be rapid. The biggest challenges will be increasing beam power and transporting beams across greater energy gains. These need to be addressed to reach the stringent beam quality and machine requirements of longer term and higher energy applications.

How to democratise radiation therapy https://cerncourier.com/a/how-to-democratise-radiation-therapy/ Fri, 05 Jul 2024 09:27:03 +0000 https://preview-courier.web.cern.ch/?p=110863 Manjit Dosanjh and Steinar Stapnes tell the Courier about the need to disrupt the market for a technology that is indispensable when treating cancer.

How important is radiation therapy to clinical outcomes today?
Manjit Dosanjh

Manjit Fifty to 60% of cancer patients can benefit from radiation therapy for cure or palliation. Pain relief is also critical in low- and middle-income countries (LMICs) because by the time tumours are discovered it is often too late to cure them. Radiation therapy typically accounts for 10% of the cost of cancer treatment, but more than half of the cure, so it’s relatively inexpensive compared to chemotherapy, surgery or immunotherapy. Radiation therapy will be tremendously important for the foreseeable future.

What is the state of the art?

Manjit The most precise thing we have at the moment is hadron therapy with carbon ions, because the Bragg peak is very sharp. But there are only 14 facilities in the whole world. It’s also hugely expensive, with each machine costing around $150 million (M). Proton therapy is also attractive, with each proton delivering about a third of the radiobiological effect of a carbon ion. The first proton patient was treated at Berkeley in September 1954, in the same month CERN was founded. Seventy years later, we have about 130 machines and we’ve treated 350,000 patients. But the reality is that we have to make the machines more affordable and more widely available. Particle therapy with protons and hadrons probably accounts for less than 1% of radiation-therapy treatments whereas roughly 90 to 95% of patients are treated using electron linacs. These machines are much less expensive, costing between $1M and $5M, depending on the model and how good you are at negotiating.

Most radiation therapy in the developing world is delivered by cobalt-60 machines. How do they work?

Manjit A cobalt-60 machine treats patients using a radioactive source. Cobalt-60 has a half-life of just over five years, so as the source ages patients have to be treated for longer and longer to receive the same dose – a hardship for them that also reduces the number of patients who can be treated. Linacs are superior because you can take advantage of advanced treatment options that target the tumour using focusing, multi-beams and imaging. You come in from different directions and energies, and you can paint the tumour with precision. To the best extent possible, you can avoid damaging healthy tissue. And the other thing about linacs is that once you turn it off there’s no radiation anymore, whereas cobalt machines present a security risk. One reason we’ve got funding from the US Department of Energy (DOE) is because our work supports their goal of reducing global reliance on high-activity radioactive sources through the promotion of non-radioisotopic technologies. The problem was highlighted by the ART (access to radiotherapy technologies) study I led for the International Cancer Expert Corps (ICEC) on the state of radiation therapy in former Soviet Union countries. There, the legacy has always been cobalt. Only three of the 11 countries we studied have had the resources and knowledge to be able to go totally to linacs. Most still have more than 50% cobalt radiation therapy.
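
The decay arithmetic behind that hardship is simple to make concrete. A minimal sketch, assuming only the roughly 5.3-year half-life of cobalt-60:

```python
# Cobalt-60 dose rate falls with a ~5.3 year half-life, so delivering a fixed
# dose takes progressively longer as the source ages. Illustrative numbers.
HALF_LIFE_Y = 5.27

def treatment_time_factor(years_since_new_source):
    """How much longer a fixed dose takes, relative to a fresh source."""
    return 2.0 ** (years_since_new_source / HALF_LIFE_Y)

for t in (0, 2, 5, 8):
    print(f"after {t} y: treatment takes {treatment_time_factor(t):.2f}x as long")
```

After one half-life a session takes twice as long, halving the number of patients a machine can treat per day.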

The kick-off meeting for STELLA took place at CERN from 29 to 30 May. How will the project work?

Manjit STELLA stands for Smart Technology to Extend Lives with Linear Accelerators. We are an international collaboration working to increase access to radiation therapy in LMICs, and in rural regions in high-income countries. We’re working to develop a linac that is less expensive, more robust and, in time, less costly to operate, service and maintain than currently available options.

Steinar Stapnes

Steinar $1.75M funding from the DOE has launched an 18-month “pre-design” study. ICEC and CERN will collaborate with the universities of Oxford, Cambridge and Lancaster, and a network of 28 LMICs that advise and guide us, providing vital input on their needs. We’re not going to build a radiation-therapy machine, but we will specify it to such a level that we can have informed discussions with industry partners, foundations, NGOs and governments who are interested in investing in developing lower cost and more robust solutions. The next steps, including prototype construction, will require a lot more funding.

What motivates the project?

Steinar The basic problem is that access to radiation therapy in LMICs is embarrassingly limited. Most technical developments are directed towards high-income countries, ultimately profiting the rich people in the world – in other words, ourselves. At present, only 10% of patients in LMICs have access to radiation therapy.

Manjit The basic design of the linac hasn’t changed much in 70 years. Despite that, prices are going up, and the cost of service contracts and software upgrades is very high. Currently, we have around 420 machines in Africa, many of which are down for long intervals, which often impacts treatment outcomes. Often, a hospital can buy the linac but they can’t afford the service contract or repairs, or they don’t have staff with the skills to maintain them. I was born in a small village with no gas, electricity or water. I wasn’t supposed to go to school because girls didn’t. I was fortunate to have got an education that enabled me to have a better life with access to the healthcare treatments that I need. I look at this question from the perspective of how we can make radiation therapy available around the world in places such as where I’m originally from.

What’s your vision for the STELLA machine?

Steinar We want to get rid of the cobalt machines because they are not as effective as linacs for cancer treatment and they are a security risk. Hadron-therapy machines are more costly, but they are more precise, so we need to make them more affordable in the future. As Manjit said, globally 90 or 95% of radiation treatments are given by an electron linac, most often running at 6 MeV. In a modern radiation therapy facility today, such linacs are not developing so fast. Our challenge is to make them more reliable and serviceable. We want to develop a workhorse radiation therapy system that can do high-quality treatment. The other, perhaps more important, key parts are imaging and software. CERN has valuable experience here because we build and integrate a lot of detector systems including readout and data-analysis. From a certain perspective, STELLA will be an advanced detector system with an integrated linac.

Are any technical challenges common to both STELLA and to projects in fundamental physics?

Steinar The early and remote prediction of faults is one. This area is developing rapidly, and it would be very interesting for us to deploy this on a number of accelerators. On the detector and sensor side, we would like to make STELLA easily upgradeable, and some of these upgrades could be very much linked to what we want to do for our future detectors. This can increase the industrial base for developing these types of detectors as the medical market is very large. Software can also be interesting, for example for distributed monitoring and learning.

Where are the biggest challenges in bringing STELLA to market?

Steinar We must make medical linacs open in terms of hardware. Hospitals with local experts must be able to improve and repair the system. It must have a long lifetime. It needs to be upgradeable, particularly with regard to imaging, because detector R&D and imaging software are moving quickly. We want it to be open in terms of software, so that we can monitor the performance of the system, predict faults, and do treatment planning off site using artificial intelligence. Our biggest contribution will be to write a specification for a system where we “enforce” this type of open hardware and open software. Everything we do in our field relies on that open approach, which allows us to integrate the expertise of the community. That’s something we’re good at at CERN and in our community. A challenge for STELLA is to build in openness while ensuring that the machines can remain medically qualified and operational at all times.

How will STELLA disrupt the model of expensive service contracts and lower the cost of linacs?

Steinar This is quite a complex area, and we don’t know the solution yet. We need to develop a radically different service model so that developing countries can afford to maintain their machines. Deployment might also need a different approach. One of the work packages of this project is to look at different models and bring in expertise on new ideas. The challenges are not unique to radiation therapy. In the next 18 months we’ll get input from people who’ve done similar things.

A medical linac at the Genolier Clinic

Manjit Gavi, the global alliance for vaccines, was set up 24 years ago to save the millions of children who were dying every year from vaccine-preventable diseases such as measles, TB, tetanus and rubella, using vaccinations that were not available to children in poorer parts of the world, especially Africa. Before, people were dying of these diseases, but now they get a vaccination and live. Vaccines and radiation therapy are totally different technologies, but we may need to think that way to really make a critical difference.

Steinar There are differences with respect to vaccine development. A vaccine is relatively cheap, whereas a linac costs millions of dollars. The diseases addressed by vaccines affect a lot of children, more so than cancer, so the patients have a different demographic. But nonetheless, the fact is that there was a group of countries and organisations who took this on as a challenge, and we can learn from their experiences.

Manjit We would like to work with the UN on their efforts to get rid of the disparities and focus on making radiation therapy available to the 70% of the world that doesn’t have access. To accomplish that, we need global buy-in, especially from the countries who are really suffering, and we need governmental, private and philanthropic support to do so.

What’s your message to policymakers reading this who say that they don’t have the resources to increase global access to radiation therapy?

Steinar Our message is that this is a solvable problem. The world needs roughly 5000 machines at $5M or less each, which on a global scale is absolutely achievable. We have to find a way to spread the technology and make it available to the whole world. The problem is very concrete, and the solution is clear from a technical standpoint.

Manjit The International Atomic Energy Agency (IAEA) have said that the world needs one of these machines for every 200 to 250 thousand people. Globally, we have a population of 8 billion. This is therefore a huge opportunity for businesses and a huge opportunity for governments to improve the productivity of their workforces. If patients are sick they are not productive. Particularly in developing countries, patients are often of a working economic age. If you don’t have good machines and early treatment options for these people, not only are they not producing, but they’re going to have to be taken care of. That’s an economic burden on the health service and there is a knock-on effect on agriculture, food, the economy and the welfare of children. One example is cervical cancer. Nine out of 10 deaths from cervical cancer are in developing countries. For every 100 women affected, 20 to 30 children die because they don’t have family support.

How can you make STELLA attractive to investors?

Steinar Our goal is to be able to discuss the project with potential investor partners – and not only in industry but also governments and NGOs, because the next natural step will be to actually build a prototype. Ultimately, this has to be done by industry partners. We likely cannot rely on them to completely fund this out of their own pockets, because it’s a high-risk project from a business point of view. So we need to develop a good business model and find government and private partners who are willing to invest. The dream is to go into a five-year project after that.

Manjit It’s important to remember that this opportunity is not only linked to low-income countries. One in two UK citizens will get cancer in their lifetime, but according to a study that came out in February, only 25 to 28% of UK citizens have adequate access to radiation therapy. This is also an opportunity for young people to join an industrial system that could actually solve this problem. Radiation therapy is one of the most multidisciplinary fields there is, all the way from accelerators to radio-oncology and everything in between. The young generation is altruistic. This will capture their spirit and imagination.

Can STELLA help close the radiation-therapy gap?

Manjit When the IAEA first visualised radiation-therapy inequalities in 2012, it raised awareness, but it didn’t move the needle. That’s because it’s not enough to just train people. We also need more affordable and robust machines. If in 10 or 20 years people start getting treatment because they are sick, not because they’re dying, that would be a major achievement. We need to give people hope that they can recover from cancer.

Iodine vapours impact climate modelling https://cerncourier.com/a/iodine-vapours-impact-climate-modelling/ Wed, 27 Mar 2024 18:53:29 +0000 https://preview-courier.web.cern.ch/?p=110349 Climate models are missing an important source of aerosol particles in polar and marine regions, according to new results from the CLOUD experiment at CERN.

FLOTUS quartz flow-tube system

Climate models are missing an important source of aerosol particles in polar and marine regions, according to new results from the CLOUD experiment at CERN. Atmospheric aerosol particles exert a strong net cooling effect on the climate by making clouds brighter and more extensive, thereby reflecting more sunlight back out to space. However, how aerosol particles form in the atmosphere remains poorly understood, especially in polar and marine regions.

The CLOUD experiment, located in CERN’s East Area, maintains ultra-low contaminant levels and precisely controls all experimental parameters affecting aerosol formation and growth under realistic atmospheric conditions. During the past 15 years, the collaboration has uncovered new processes through which aerosol particles form from mixtures of vapours and grow to sizes where they can seed cloud droplets. A beam from the Proton Synchrotron simulates, in the CLOUD chamber, the ionisation from galactic cosmic rays at any altitude in the troposphere.

Globally, the main vapour driving particle formation is thought to be sulphuric acid, stabilised by ammonia. However, ammonia is frequently lacking in polar and marine regions, and models generally underpredict the observed particle-formation rates. The latest CLOUD study challenges this view, by showing that iodine oxoacids can replace the role of ammonia and act synergistically with sulphuric acid to greatly enhance particle-formation rates.

“Our results show that climate models need to include iodine oxoacids along with sulphuric acid and other vapours,” says CLOUD spokesperson Jasper Kirkby. “This is particularly important in polar regions, which are highly sensitive to small changes in aerosol particles and clouds. Here, increased aerosol and clouds actually have a warming effect by absorbing infrared radiation otherwise lost to space, and then re-radiating it back down to the surface.”

The new findings build on earlier CLOUD studies which showed that iodine oxoacids rapidly form particles even in the complete absence of sulphuric acid. At iodine oxoacid concentrations typical of marine and polar regions (between 0.1 and 5 times the concentration of sulphuric acid), the CLOUD data show that the formation rates of sulphuric acid particles are between 10 and 10,000 times faster than previous estimates.

“Global marine iodine emissions have tripled in the past 70 years due to thinning sea ice and rising ozone concentrations, and this trend is likely to continue,” adds Kirkby. “The resultant increase of marine aerosol particles and clouds, suggested by our findings, will have created a positive feedback that accelerates the loss of sea ice in polar regions, while simultaneously introducing a cooling effect at lower latitudes. The next generation of climate models will need to take iodine vapours and their synergy with sulphuric acid into account.”

First TIPP in Africa a roaring success https://cerncourier.com/a/first-tipp-in-africa-a-roaring-success/ Wed, 17 Jan 2024 09:44:28 +0000 https://preview-courier.web.cern.ch/?p=110085 The 6th conference on Technology and Instrumentation in Particle Physics highlighted strong knowledge-transfer opportunities.

The Conference on Technology and Instrumentation in Particle Physics (TIPP) is the largest conference of its kind. The sixth edition, which took place in Cape Town from 4 to 8 September 2023 and attracted 250 participants, was the first in Africa. More than 200 presentations covered state-of-the-art developments in detectors and instrumentation for particle physics, astroparticle physics and closely related fields.

“As South Africa, we regard this opportunity as a great privilege for us to host this year’s edition of the TIPP conference,” said minister of higher education, science and innovation Blade Nzimande during an opening address. He was followed by speeches from Angus Paterson, deputy CEO of the National Research Foundation, and Makondelele Victor Tshivhase, director of the national research facility iThemba LABS.

The South African CERN (SA–CERN) programme within the National Research Foundation and iThemba LABS supports more than 120 physicists, engineers and students that contribute to the ALICE, ATLAS and ISOLDE experiments, and to theoretical particle physics. The SA–CERN programme identifies technology transfer in particle physics as key to South African society. This aligns symbiotically with the technology innovation platform of iThemba LABS to create a platform for innovation, incubation, industry collaboration and growth. For the first time, TIPP 2023 included a dedicated parallel session on technology transfer, which was chaired by Massimo Caccia (University of Insubria), Paolo Giacomelli (INFN Bologna) and Christophe De La Taille (CNRS/IN2P3).

The scientific programme kicked off with a plenary presentation on the implementation of the ECFA detector R&D roadmap in Europe by Thomas Bergauer (HEPHY). Other plenary presentations included overviews on bolometers for neutrinos, the Square Kilometre Array (SKA), technological advances by the LHC experiments, NaI experiments, advances in instrumentation at iThemba LABS, micro-pattern gaseous detectors, inorganic and liquid scintillator detectors, noble liquid experiments, axion detection, water Cherenkov detectors for neutrinos, superconducting technology for future colliders and detectors, and the PAUL facility in South Africa.

A panel discussion between former CERN Director-General Rolf Heuer (DESY), Michel Spiro (IRFU) and Manfred Krammer (CERN), Imraan Patel (deputy director general of the Department of Science and Innovation), Angus Paterson and Rob Adam (SKA) triggered an exchange of insights about international research infrastructures such as CERN and SESAME for particle physics and science diplomacy.

Prior to TIPP2023, 25 graduate students from Botswana, Cameroon, Ghana, South Africa and Zambia participated in a school of instrumentation in particle, nuclear and medical physics held at iThemba LABS, comprising lectures, hands-on demonstrations, and insightful presentations by researchers from CERN, DESY and IJCLAB, which provided a global perspective on instrumentation.

Pushing the intensity frontier at ECN3 https://cerncourier.com/a/pushing-the-intensity-frontier-at-ecn3/ Wed, 25 Oct 2023 14:12:20 +0000 https://preview-courier.web.cern.ch/?p=109474 A technical design study has recently been launched for a new high-intensity physics programme that will shape the physics landscape at CERN's North Area.

Following a decision taken during the June session of the CERN Council to launch a technical design study for a new high-intensity physics programme at CERN’s North Area, a recommendation for experiment(s) that can best take advantage of the intense proton beam on offer is expected to be made by the end of 2023.

The design study concerns the extraction of a high-intensity beam from the Super Proton Synchrotron (SPS) to deliver up to a factor of approximately 20 more protons per year to ECN3 (Experimental Cavern North 3). It is an outcome of the Physics Beyond Colliders (PBC) initiative, which was launched in 2016 to explore ways to further diversify and expand the CERN scientific programme by covering kinematical domains that are complementary to those accessible to high-energy colliders, with a focus on programmes for the start of operations after Long Shutdown 3 towards the end of the decade.

To employ a high-intensity proton beam at a fixed-target experiment in the North Area and to effectively exploit the protons accelerated by the SPS, the beam must be extracted slowly. In contrast to fast extraction within a single turn of the synchrotron, which utilises kicker magnets to change the path of a passing proton bunch, slow extraction gradually shaves the beam over several hundred thousand turns to produce a continuous flow of protons over a period of several seconds. One important limitation to overcome concerns particle losses during the extraction, foremost on the thin electrostatic extraction septum of the SPS but also along the transfer line leading to the North Area target stations. An R&D study backed by the PBC initiative has shown that it is possible to deflect the protons away from the blade of the electrostatic septum using thin, bent crystals. “Based on the technical feasibility study carried out in the PBC Beam Delivery ECN3 task force, CERN is confident in reaching the beam intensities required for all experiments,” says ECN3 project leader Matthew Fraser.
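
As a rough consistency check (using round numbers that are not quoted in this article: the SPS circumference is about 6.9 km, so one revolution takes roughly 23 microseconds, and a typical slow-extraction spill lasts a few seconds):

N_{\text{turns}} \approx \frac{t_{\text{spill}}}{t_{\text{rev}}} \approx \frac{4.8\ \text{s}}{2.3\times10^{-5}\ \text{s}} \approx 2\times10^{5},

in line with the several hundred thousand turns mentioned above.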

Currently, ECN3 hosts the NA62 experiment, which searches for ultra-rare kaon decays as well as for feebly-interacting particles (FIPs). Three experimental proposals that could exploit a high-intensity beam in ECN3 have been submitted to the SPS committee, and on 6 December the CERN research board is expected to decide which should be taken forward. The High-Intensity Kaon Experiment (HIKE), which requires an increase of the current beam intensity by a factor of between four and seven, aims to increase the precision on ultra-rare kaon decays to further constrain the Cabibbo–Kobayashi–Maskawa unitarity triangle and to search for decays of FIPs that may appear on the same axis as the dumped proton beam. Looking for off-axis FIP decays, the SHADOWS (Search for Hidden And Dark Objects With the SPS) programme could run alongside HIKE when operated in beam-dump mode. Alternatively, the SHiP (Search for Hidden Particles) experiment would investigate hidden sectors such as heavy neutral leptons in the GeV mass range and also enable access to muon- and tau-neutrino physics in a dedicated beam-dump facility installed in ECN3.

The ambitious programme to provide and prepare the high-intensity ECN3 facility for the 2030s onwards is driven in synergy with the North Area consolidation project, which has been ongoing since Long Shutdown 2. Works are planned to be carried out without impacting the other beamlines and experiments in the North Area, with first beam commissioning of the new facility expected from 2030.

“Once the experimental decision has been made, things will move quickly and the experimental groups will be able to form strong collaborations around a new ECN3 physics facility, upgraded with the help of CERN’s equipment and service groups,” says Markus Brugger, co-chair of the PBC ECN3 task force.

Elevating the performance of ionization vacuum gauges with simulation https://cerncourier.com/a/elevating-the-performance-of-ionization-vacuum-gauges-with-simulation/ Tue, 24 Oct 2023 08:00:25 +0000 https://preview-courier.web.cern.ch/?p=109454 Multiphysics modelling underpinned development of an advanced ionization gauge for pressure measurements in high-vacuum and ultrahigh-vacuum systems.

Innovation often becomes a form of competition. It can be thought of as a race among creative people, where standardized tools measure progress toward the finish line. For many who strive for technological innovation, one such tool is the vacuum gauge.

High-vacuum and ultra-high-vacuum (HV/UHV) environments are used for researching, refining and producing many manufactured goods. But how can scientists and engineers be sure that pressure levels in their vacuum systems are truly aligned with those in other facilities? Without shared vacuum standards and reliable tools for meeting these standards, key performance metrics – whether for scientific experiments or products being tested – may not be comparable. To realize a better ionization gauge for measuring pressure in HV/UHV environments, INFICON of Liechtenstein used multiphysics modelling and simulation to refine its product design.

A focus on gas density

The resulting Ion Reference Gauge 080 (IRG080) from INFICON is more accurate and reproducible than existing ionization gauges. Development of the IRG080 was coordinated by the European Metrology Programme for Innovation and Research (EMPIR). This collaborative R&D effort by private companies and government research organizations aims to make Europe’s “research and innovation system more competitive on a global scale”. The project participants, working within EMPIR’s 16NRM05 Ion Gauge project, considered multiple options before agreeing that INFICON’s gauge design best fulfilled the performance goals.

Ion Reference Gauge

Of course, different degrees of vacuum require their own specific approaches to pressure measurement. “Depending on conditions, certain means of measuring pressure work better than others,” explained Martin Wüest, head of sensor technology at INFICON. “At near-atmospheric pressures, you can use a capacitive diaphragm gauge. At middle vacuum, you can measure heat transfer occurring via convection.” Neither of these approaches is suitable for HV/UHV applications. “At HV/UHV pressures, there are not enough particles to force a diaphragm to move, nor are we able to reliably measure heat transfer,” added Wüest. “This is where we use ionization to determine gas density and corresponding pressure.”

The most common HV/UHV pressure-measuring tool is a Bayard–Alpert hot-filament ionization gauge, which is placed inside the vacuum chamber. The instrument includes three core building blocks: the filament (or hot cathode), the grid and the ion collector. Its operation requires the supply of low-voltage electric current to the filament, causing it to heat up. As the filament becomes hotter, it emits electrons that are attracted to the grid, which is supplied with a higher voltage. Some of the electrons flowing toward and within the grid will collide with any free-floating gas molecules that are circulating in the vacuum chamber. Electrons that collide with gas molecules will form ions that then flow toward the collector, with the measurable ion current in the collector proportional to the density of gas molecules in the chamber.

“We can then convert density to pressure, according to the ideal gas law,” explained Wüest. “Pressure will be proportional to the ion current divided by the electron current, [in turn] divided by a sensitivity factor that is adjusted depending on what gas is in the chamber.”
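
In symbols, Wüest’s description amounts to the following (our restatement, with I_ion the collector current, I_e the electron emission current and S the gas-dependent sensitivity factor):

P = \frac{1}{S}\,\frac{I_{\mathrm{ion}}}{I_{e}}

so a gauge with a larger sensitivity factor delivers more ion current per unit pressure.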

Better by design

Unfortunately, while the operational principles of the Bayard–Alpert ionization gauge are sound and well understood, their performance is sensitive to heat and rough handling. “A typical ionization gauge contains fine metal structures that are held in spring-loaded tension,” said Wüest. “Each time you use the device, you heat the filament to between 1200 and 2000 °C. That affects the metal in the spring and can distort the shape of the filament, [thereby] changing the starting location of the electron flow and the paths the electrons follow.”

At the same time, the core components of a Bayard–Alpert gauge can become misaligned all too easily, introducing measurement uncertainties of 10 to 20% – an unacceptably wide range of variation. “Most vacuum-chamber systems are overbuilt as a result,” noted Wüest, and the need for frequent gauge recalibration also wastes precious development time and money.

The IE514 gauge

With this in mind, the 16NRM05 Ion Gauge project team set a measurement uncertainty target of 1% or less for its benchmark gauge design (when used to detect nitrogen gas). Another goal was to eliminate the need to recalibrate gas sensitivity factors for each gauge and gas species under study. The new design also needed to be unaffected by minor shocks and reproducible by multiple manufacturers.

To achieve these goals, the project team first dedicated itself to studying HV/UHV measurement. Their research encompassed a broad review of 260 relevant studies. After completing their review, the project partners selected one design that incorporates current best practice for ionization gauge design: INFICON’s IE514 extractor-type gauge. Subsequently, three project participants – at NOVA University Lisbon, CERN and INFICON – each developed their own simulation models of the IE514 design. Their results were compared to test results from a physical prototype of the IE514 gauge to ensure the accuracy of the respective models before proceeding towards an optimized gauge design.

Computing the sensitivity factor

Francesco Scuderi, an INFICON engineer who specializes in simulation, used the COMSOL Multiphysics® software to model the IE514. The model enabled analysis of thermionic electron emissions from the filament and the ionization of gas by those electrons. The model can also be used for ray tracing the paths of generated ions toward the collector. With these simulated outputs, Scuderi could calculate an expected sensitivity factor, which is based on how many ions are detected per emitted electron – a useful metric for comparing the overall fidelity of the model with actual test results.

“After constructing the model geometry and mesh, we set boundary conditions for our simulation,” Scuderi explained. “We are looking to express the coupled relationship of electron emissions and filament temperature, which will vary from approximately 1400 to 2000 °C across the length of the filament. This variation thermionically affects the distribution of electrons and the paths they will follow.”

He continued: “Once we simulate thermal conditions and the electric field, we can begin our ray-tracing simulation. The software enables us to trace the flow of electrons to the grid and the resulting coupled heating effects.”

Next, the model is used to calculate the percentage of electrons that collide with gas particles. From there, ray-tracing of the resulting ions can be performed, tracing their paths toward the collector. “We can then compare the quantity of circulating electrons with the number of ions and their positions,” noted Scuderi. “From this, we can extrapolate a value for ion current in the collector and then compute the sensitivity factor.”
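
A minimal sketch of that final step (illustrative Python with invented numbers and function names – not INFICON’s code, and a real model must account for gas species and geometry):

# Illustrative only: estimate an ionization-gauge sensitivity factor
# from ray-tracing statistics at a known reference pressure.
def sensitivity_factor(n_electrons: int, n_ions: int, pressure_mbar: float) -> float:
    """S = I_ion / (I_e * P); the currents reduce to the simulated
    ion-per-electron yield, so S = yield / pressure (units: 1/mbar)."""
    ion_yield = n_ions / n_electrons  # ions collected per emitted electron
    return ion_yield / pressure_mbar

# Invented example: 20 ions per million traced electrons at 1e-6 mbar
# of nitrogen gives S = 20 per mbar, a plausible order of magnitude.
S = sensitivity_factor(1_000_000, 20, 1e-6)
print(f"sensitivity factor ~ {S:.3g} per mbar")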

INFICON’s model did an impressive job of generating simulated values that aligned closely with test results from the benchmark prototype. This enabled the team to observe how changes to the modelled design affected key performance metrics, including ionization energy, the paths of electrons and ions, emission and transmission current, and sensitivity.

The end-product of INFICON’s design process, the IRG080, incorporates many of the same components as existing Bayard–Alpert gauges, but key parts look quite different. For example, the new design’s filament is a solid suspended disc, not a thin wire. The grid is no longer a delicate wire cage but is instead made from stronger formed metal parts. The collector now consists of two components: a single pin or rod that attracts ions and a solid metal ring that directs electron flow away from the collector and toward a Faraday cup (to catch the charged particles in vacuum). This arrangement, refined through ray-tracing simulation with the COMSOL Multiphysics® software, improves accuracy by better separating the paths of ions and electrons.

A more precise, reproducible gauge

INFICON, for its part, built 13 prototypes for evaluation by the project consortium. Testing showed that the IRG080 achieved the goal of reducing measurement uncertainty to below 1%. As for sensitivity, the IRG080 performed eight times better than the consortium’s benchmark gauge design. Equally important, the INFICON prototype yielded consistent results during multiple testing sessions, delivering sensitivity repeatability performance that was 13 times better than that of the benchmark gauge. In all, 23 identical gauges were built and tested during the project, confirming that INFICON had created a more precise, robust and reproducible tool for measuring HV/UHV conditions.

“We consider [the IRG080] a good demonstration of [INFICON’s] capabilities,” said Wüest.

• This story has been abridged. Read the full article at https://www.comsol.com/c/f6rx.

COMSOL Multiphysics is a registered trademark of COMSOL

Cryogenics at FAIR: adaptability is key https://cerncourier.com/a/cryogenics-at-fair-adaptability-is-key/ Mon, 17 Jul 2023 13:46:36 +0000 https://preview-courier.web.cern.ch/?p=108918 Cryogenics is a core enabling technology in the Facility for Antiproton and Ion Research (FAIR), which is under construction in Germany.

Holger Kollmus and Marion Kauschke

The Facility for Antiproton and Ion Research (FAIR) in Darmstadt, Germany, represents an ambitious reimagining  of the GSI Helmholtz Center for Heavy Ion Research, one of Europe’s leading accelerator research laboratories. When it comes online for initial user experiments in 2027, FAIR will provide scientists from around the world with a multipurpose accelerator complex that’s built to address a broad-scope research canvas – everything from hadron physics, nuclear structure and astrophysics to atomic physics, materials science and radiation biophysics (as well as downstream applications in cancer therapy and space science). 

At the schematic level, FAIR will generate primary beams – from protons up to uranium ions – as well as secondary beams of antiprotons and rare isotopes. As such, the accelerator facility is optimised to deliver intense and energetic beams of particles to different production targets. The resulting beams will subsequently be steered to various fixed-target experiments or injected into specialist storage rings for in-ring experiments with high-quality beams of secondary antiprotons or radioactive ions. 

GSI accelerators and FAIR facilities

Underpinning all this experimental firepower are FAIR’s main building blocks: the fast-ramping SIS100 synchrotron, which provides intense primary beams; the Super Fragment Separator (Super-FRS), which filters out the exotic ion beams; and the storage rings (see “From here to FAIR”, below). Meanwhile, the existing GSI accelerators (UNILAC and SIS18) will serve as injectors and pre-accelerators for SIS100, while a new proton linac will provide high-intensity injection into the synchrotron chain. Here Holger Kollmus and Marion Kauschke – head and deputy head, respectively, of the GSI/FAIR cryogenics programme – tell CERN Courier how the laboratory’s cryogenic infrastructure and specialist expertise at ultralow temperatures are fundamental to FAIR’s long-term scientific mission.

Let’s start with the basics. How has the cryogenics programme at GSI evolved as FAIR moves from concept to reality? 

HK: While cryogenics does not have an extensive back-story at GSI – only two large-scale experiments have deployed superconducting magnets to date – the strategic decision to build FAIR put ultralow-temperature technology at the heart of GSI’s development roadmap. Consider the requirement for specialist infrastructure to provide at-scale testing of FAIR’s superconducting magnets. A case in point is the Prototype Test Facility (PTF) which, between 2005 and 2012, was used to evaluate five candidate magnet designs. One of these prototypes, the so-called first-of-series (FOS) magnet, was subsequently specified for the SIS100 ring (110 dipole magnets in total, with two spares).

FAIR magnets and CERN test facility

It soon became clear, however, that the PTF’s single test stand was not fit-for-purpose to validate all of the magnets within a reasonable timeframe. Instead, that task was allocated to the Series Test Facility (STF), which came onstream in 2013 with cryogenic plant and equipment provided by Swiss manufacturer Linde Kryotechnik. Informed by lessons learned on the PTF, the STF maximised throughput and workflow efficiency for large-scale testing of the SIS100 dipole magnets.

How did you realise STF workflow efficiencies?

MK: Custom building design and layout are key, with a slide system for the superconducting magnets under test, a bellows-free mounting and accessible interfaces between the feed box, magnet and end box. The feed box and end box enclose the superconducting magnet on both sides for testing, with the former additionally supplying the magnet with liquid helium coolant and electrical current. The liquid helium keeps the magnet at a constant 4.5 K, while shielding (maintained between 50–80 K) reduces any heating of the cryogenically cooled magnet (the so-called “cold mass”). 

SIS100 ring

At the same time, the compressor and STF cold box for the liquid helium are physically separated in an adjacent building, thereby minimising noise and vibration levels in the test environment. The cryogenic distribution system is installed on a gallery to enhance staff access between the four test stands, while the cold box itself has a cooling power of 800 W at 4–5 K, 2000 W at 50–80 K and a liquefaction capacity of 6 g/s. 

All of the SIS100 dipoles have now been tested in the STF, with the facility’s four test stands allowing for “four-stroke” operation. Put simply: on one test stand, the magnet is assembled; the second is in cool-down; the third is cold and the magnet is under test; and the fourth is in warm-up mode. This resulted in each magnet being in the STF hall for about a month, with delivery of one new magnet each week. It is worth noting that if any magnet had failed under test – though none did – it would have been taken to the PTF without interrupting the “assembly-line” work.
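
For illustration, the “four-stroke” rotation can be written down as a toy schedule (a sketch assuming exactly one week per stage, consistent with the one-month residence and one-magnet-per-week delivery quoted above; not GSI’s planning software):

# Toy model of the STF "four-stroke" magnet-testing pipeline.
STAGES = ["assembly", "cool-down", "cold test", "warm-up"]

def schedule(n_magnets: int) -> None:
    """Magnet m (1-based) occupies stage s during week m + s, so the four
    stands run in parallel and one magnet completes per week from week 4."""
    for week in range(1, n_magnets + len(STAGES)):
        active = {s: week - s for s in range(len(STAGES))
                  if 1 <= week - s <= n_magnets}
        done = max(0, week - len(STAGES) + 1)  # magnets finished by end of week
        stands = ", ".join(f"{STAGES[s]}=M{m}" for s, m in active.items())
        print(f"week {week}: {stands} | completed: {done}")

schedule(6)  # each magnet spends four weeks in the hall; one finishes weekly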

Does that mean the PTF and STF will now be decommissioned? 

MK: The R&D activity at the PTF and STF is far from over. One dipole magnet is undergoing endurance testing in the PTF, while the STF is being used to test SIS100 quadrupole modules as well as prototypes of other SIS100 and Super-FRS components (such as the transfer lines needed to distribute liquid helium from source and feed boxes for the SIS100 and Super-FRS). When testing at the STF is complete – most likely in 2028 – two of the four test benches will be dismantled and part of the hall will be repurposed for a superconducting CW linac (to be cryogenically supplied by the STF). 

Presumably, the GSI cryogenics team engages with other large-scale facilities to enhance its R&D and test capabilities?

HK: That’s correct. The testing of superconducting magnets requires technical personnel with specialist domain knowledge and expertise  to measure and validate magnetic and electrical properties; provide the cryogenic supply within certain temperature/pressure limits; as well as to measure the magnet calorimetrically (for example, with regard to its heat load). 

Cryogenic by-pass lines

CERN, as a pioneer in superconducting magnets for high-energy physics, is one of our main technology partners. As such, the superconducting magnets for the Super-FRS – dipoles as well as multiplets – are undergoing acceptance testing at CERN on their way to Darmstadt from the manufacturers in Italy, France and Spain. Another joint effort is focused on FAIR’s cryogenic machine control, transferring established solutions for the control of valves, temperature/pressure sensors and a range of other subsystems using the CERN software UNICOS.

So collaboration and knowledge exchange are fundamental to project delivery?

HK: Partnership with other cryogenics groups across Europe underpins our deployment model. The equipment needed for local cryogenic distribution to the magnets, for example, is provided by an in-kind contribution from Wroclaw University of Science and Technology (WUST) – tapping into the Polish team’s work on other large-scale cryogenics projects including the European Spallation Source (ESS) in Sweden and the European XFEL here in Germany. Another strategic R&D partner is the Test Facility for large Magnet and superconducting Line (TFML) in Salerno, Italy. Part of the Istituto Nazionale di Fisica Nucleare (INFN), the TFML’s refrigeration capacity and testing facility are available for SIS100 quadrupole testing, thereby opening up test capacity at GSI for other cryogenic components/subsystems such as feed boxes and current lead boxes. The latter enable the warm-to-cold transition for the electrical current, on the way from the “warm” power converter to the “cold” magnets. 

Where are the big crunch-points for cryogenic cooling within FAIR?

HK: The SIS100 and the Super-FRS are the principal consumers in terms of FAIR’s cryogenic cooling capacity – each with a cold connection to a single large refrigeration plant called CRYO2. The SIS100 (with a circumference of 1100 m) is characterised by high dynamic-load changes with a duration of several hours. In terms of design, the ring comprises an array of dipole and quadrupole magnets in a configuration that exploits an internally cooled superconducting cable (with the superconducting strands cooled using two-phase helium). 

Transportation of the cold box

Operationally, the SIS100 magnets have to be ramped during the acceleration of the heavy ions, with the ramp and repetition rate adapted to the ions and experimental set-up to yield different heat loads at the 4 K level. The change between these different cycles should be as short as possible (of the order of less than one hour), with control of the supply pressure inducing different helium flows for the magnet cooling. 

Installation of FAIR’s warm compressor system

Meanwhile, the Super-FRS (at 350 m long) will contain 1500 tons of cold mass that must be cooled in a realistic timeframe (typically one month). A dedicated cool-down and warm-up unit (CWU), using liquid nitrogen as coolant for a helium circuit, is pivotal in this regard and fulfils the Super-FRS requirements with respect to maximal cool-down rates and temperature differences. 

What are the challenges of integrating FAIR’s cryogenic infrastructure with the existing GSI facilities?  

MK: FAIR’s main cryogenic supply building comprises two independent halls, each having its own foundations. The front hall – which houses the cold box, distribution lines and cryogenic gas management – connects to the SIS100 tunnel via pillars and an arrangement that’s designed to avoid any movement of the transfer line supplying supercritical helium to SIS100. Whereas the rear section – which houses the compressor station – sits on a “floating foundation”, essentially decoupled from the cold-box hall to minimise the impact of any resulting ground-based vibration on the SIS100 ring. 

FAIR’s cold, cold heart

Delivery of the central distribution box

FAIR’s central cryogenic plant, CRYO2, is already installed and will provide a cryogenic capacity of 14 kW at 4–5 K and 50 kW at 50–80 K. Those figures of merit will ultimately enable parallel and independent operation across FAIR’s main cryogenic consumers – servicing, for example, the varying heat loads of SIS100 (for operation of different machine cycles) as well as accommodating the large cold mass of the Super-FRS (and its liquefaction requirements). Campus-wide, the cold helium is transported to the FAIR machines by a 1.5 km-long distribution system, the installation of which is well under way.

At the heart of CRYO2 is a helium refrigerator in tandem with oil-cooled screw compressors. To optimise long-term adaptation to load changes, the mass flow-rate of coolant will be regulated in near-stepless fashion using a variable-frequency driver for the compressors. The compressor station itself is set up from five compressor skids, each having its own oil system and including a rough separation of more than 99% of the oil from the process gas. The rest of the oil is separated on the high-pressure side before the gas enters the cold box. As the CWU operates independently from the CRYO2 plant, this compressor has its own oil removal system. 

A host of other design issues have also come into play, so adaptability is key. For starters, given that FAIR is situated in a wooded recreation area for neighbouring communities, the height of the helium storage tanks is limited to the height of the average tree in the vicinity. In the same way, FAIR’s cryo buildings will integrate seamlessly with their surroundings – with the use of roof greening, for example, and a window-free design to cut out light pollution. Energy efficiency is also a priority, with the heat that’s generated during the cryogenic compression process to be recovered and used for heating in other parts of the FAIR facility, while active noise mitigation of the air-conditioning systems will minimise disturbance to wild animals.  

How is the roll-out of FAIR’s cryogenic plant progressing?

FAIR’s cryogenics building

HK: The installation of the cryogenic supply infrastructure in the cryogenic building will be finished this autumn, with the supporting infrastructure – including the electrical supply and cooling water – to be in place before spring 2025. Commissioning of the full cryogenic supply system is scheduled to complete by the end of 2025, with the first experiments at FAIR using superconducting technology to follow in 2027.

Counting half-lives to a nuclear clock https://cerncourier.com/a/counting-half-lives-to-a-nuclear-clock/ Wed, 05 Jul 2023 10:24:23 +0000 https://preview-courier.web.cern.ch/?p=108774 The observation of the radiative decay of thorium-229m at CERN's ISOLDE facility opens a path to more precise timekeepers.

The observation at CERN’s ISOLDE facility of a long-sought decay of the thorium-229 nucleus marks a key step towards a clock that could outperform today’s most precise atomic timekeepers. Publishing the results in Nature, an international team used ISOLDE’s unique facilities to measure, for the first time, the radiative decay of the metastable state thorium-229m, opening a path to direct laser manipulation of a nuclear state to build a new generation of nuclear clocks.

Today’s best atomic clocks, based on periodic transitions between two electronic states of an atom such as caesium or aluminium held in an optical lattice, achieve a relative systematic frequency uncertainty below 1 × 10⁻¹⁸, meaning they won’t lose or gain a second over about 30 billion years. Nuclear clocks would exploit the periodic transition between two states in the vastly smaller atomic nucleus, which couple less strongly to electromagnetic fields and hence are less vulnerable to external perturbations. In addition to offering a more precise timepiece, nuclear clocks could test the constancy of fundamental parameters such as the fine structure or strong-coupling constants, and enable searches for ultralight dark matter (CERN Courier September/October 2022 p32).
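
The 30-billion-year figure follows from simple arithmetic (a sanity check of the quoted uncertainty, not an additional result):

\frac{1\ \text{s}}{10^{-18}} = 10^{18}\ \text{s} \approx \frac{10^{18}\ \text{s}}{3.15\times10^{7}\ \text{s/yr}} \approx 3\times10^{10}\ \text{yr},

i.e. roughly 30 billion years per accumulated second of error.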

Higher precision

In 2003 Ekkehard Peik and Christian Tamm of Physikalisch-Technische Bundesanstalt in Germany proposed a nuclear clock based on the transition between the ground state of the thorium-229 nucleus and its first, higher-energy state. The advantage of the 229mTh isomer compared to almost all other nuclear species is its unusually low excitation level (~8 eV), which in principle allows direct laser manipulation. Despite much effort, researchers have not succeeded until now in observing the radiative decay – which is the inverse process of direct laser excitation – of 229mTh to its ground state. This allows, among other things, the isomer’s energy to be determined to higher precision.

In a novel technique based on vacuum-ultraviolet spectroscopy, lead author Sandro Kraemer of KU Leuven and co-workers used ISOLDE to generate an isomeric beam with atomic mass number A = 229, following the decay chain 229Fr → 229Ra → 229Ac → 229Th/229mTh. A fraction of 229Ac decays to the metastable, excited state of 229Th, the isomer 229mTh. To achieve this, the team incorporated the produced 229Ac into six separate crystals of calcium fluoride and magnesium fluoride of different thicknesses. Using an ultraviolet spectrometer, they measured the radiation emitted when the isomer relaxes to its ground state, determining the wavelength of the observed light to be 148.7 nm. This corresponds to an energy of 8.338 ± 0.024 eV – seven times more precise than the previous best measurements.
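
The quoted energy follows from the measured wavelength via the standard photon relation (a consistency check, using hc ≈ 1239.84 eV nm):

E = \frac{hc}{\lambda} \approx \frac{1239.84\ \text{eV nm}}{148.7\ \text{nm}} \approx 8.34\ \text{eV}.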

“ISOLDE is currently one of only two facilities in the world that can produce actinium-229 isotopes in sufficient amounts and purity,” says Kraemer. “By incorporating these isotopes in calcium fluoride or magnesium fluoride crystals, we produced many more isomeric thorium-229 nuclei and increased our chances of observing their radiative decay.”

The team’s novel approach to producing thorium-229 nuclei also made it possible to determine the lifetime of the isomer in the magnesium fluoride crystal, which helps to predict the precision of a thorium-229 nuclear clock based on this solid-state system. The result (16.1 ± 2.5 min) indicates that a clock precision competitive with that of today’s most precise atomic clocks is attainable, while such a clock would also be four orders of magnitude more sensitive to a number of effects beyond the Standard Model.
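
To see why such a lifetime is promising, one can assume a lifetime-limited linewidth (an order-of-magnitude estimate made here, not a number from the paper). With τ ≈ 16.1 min ≈ 970 s and a transition frequency ν = E/h ≈ 2.0 × 10¹⁵ Hz:

\Delta\nu \approx \frac{1}{2\pi\tau} \approx 1.6\times10^{-4}\ \text{Hz}, \qquad Q = \frac{\nu}{\Delta\nu} \approx 10^{19},

an extraordinarily sharp resonance by the standards of today’s optical clocks.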

“Solid-state systems such as magnesium fluoride crystals are one of two possible settings in which to build a future thorium-229 nuclear clock,” says the team’s spokesperson, Piet Van Duppen of KU Leuven. “Our study marks a crucial step in this direction, and it will ease the development of lasers with which to drive the periodic transition that would make such a clock tick.”

CERN shares beampipe know-how for gravitational-wave observatories https://cerncourier.com/a/cern-shares-beampipe-know-how-for-gravitational-wave-observatories/ Fri, 12 May 2023 14:18:25 +0000 https://preview-courier.web.cern.ch/?p=108551 Participants of a recent CERN workshop discussed vacuum technologies for next-generation gravitational-wave observatories such as the Einstein Telescope.

The direct detection of gravitational waves in 2015 opened a new window to the universe, allowing researchers to study the cosmos by merging data from multiple sources. There are currently four gravitational wave telescopes (GWTs) in operation: LIGO at two sites in the US, Virgo in Italy, KAGRA in Japan, and GEO600 in Germany. Discussions are ongoing to establish an additional site in India. The detection of gravitational waves is based on Michelson laser interferometry with Fabry–Perot cavities, which reveals the expansion and contraction of space at the level of ten-thousandths of the size of an atomic nucleus, i.e. 10⁻¹⁹ m. Despite the extremely low strain that needs to be detected, an average of one gravitational wave is detected per week of observation by studying and minimising all possible noise sources, including seismic vibration and residual gas scattering. The latter is reduced by placing the interferometer in a pipe where ultrahigh vacuum is generated. In the case of Virgo, the vacuum inside the two perpendicular 3 km-long arms of the interferometer is lower than 10⁻⁹ mbar.

While current facilities are being operated and upgraded, the gravitational-wave community is also focusing on a new generation of GWTs that will provide even better sensitivity. This would be achieved by longer interferometer arms, together with a drastic reduction of noise that might require cryogenic cooling of the mirrors. The two leading studies are the Einstein Telescope (ET) in Europe and the Cosmic Explorer (CE) in the US. The total length of the vacuum vessels envisaged for the ET and CE interferometers is 120 km and 160 km, respectively, with a tube diameter of 1 to 1.2 m. The required operational pressures are typical of those needed for modern accelerators (i.e. in the region of 10⁻¹⁰ mbar for hydrogen and even lower for other gas species). The next generation of GWTs would therefore represent the largest ultrahigh vacuum systems ever built.

Producing these pressures is not difficult, as present vacuum systems of GWT interferometers reach a comparable degree of vacuum. Instead, the challenge is cost. Indeed, if previous-generation solutions were adopted, the vacuum pipe system would amount to half of the estimated cost of CE, and not far from one-third of that of ET, which is dominated by underground civil engineering. Reducing the cost of vacuum systems requires the development of different technical approaches with respect to previous-generation facilities. Developing cheaper technologies is also a key subject for future accelerators, and a synergy in terms of manufacturing methods, surface treatments and installation procedures is already visible.

Within an official framework between CERN and the lead institutes of the ET study –  Nikhef in the Netherlands and INFN in Italy – CERN’s TE-VSC and EN-MME groups  are sharing their expertise in vacuum, materials, manufacturing and surface treatments with the gravitational-wave community. The activity started in September 2022 and is expected to conclude at the end of 2025 with a technical design report and a full test of a vacuum-vessel pilot sector. During the workshop “Beampipes for Gravitational Wave Telescopes 2023”, held at CERN from 27 to 29 March, 85 specialists from different communities encompassing accelerator and gravitational-wave technologies and from companies that focus on steel production, pipe manufacturing and vacuum equipment gathered to discuss the latest progress. The event followed a similar one hosted by LIGO Livingston in 2019, which gave important directions for research topics.

Plotting a course
In a series of introductory contributions, the basic theoretical elements regarding vacuum requirements and the status of CE and ET studies were presented, highlighting initiatives in vacuum and material technologies undertaken in Europe and the US. The detailed description of current GWT vacuum systems provided a starting point for the presentations of ongoing developments. To conduct an effective cost analysis and reduction, the entire process must be taken into account — including raw material production and treatment, manufacturing, surface treatment, logistics, installation, and commissioning in the tunnel. Additionally, the interfaces with the experimental areas and other services such as civil engineering, electrical distribution and ventilation are essential to assess the impact of technological choices for the vacuum pipes.

The selection criteria for the structural materials of the pipe were discussed, with steel currently being the material of choice. Ferritic steels would contribute to a significant cost reduction compared to austenitic steel, which is currently used in accelerators, because they do not contain nickel. Furthermore, thanks to their body-centred cubic crystallographic structure, ferritic steels have a much lower content of residual hydrogen – the first enemy for the attainment of ultrahigh vacuum – and thus do not require expensive solid-state degassing treatments. The cheapest ferritic steels are “mild steels”, which are commonly used in gas pipelines after treatment against corrosion. Ferritic stainless steels, which contain more than 12% by weight of dissolved chromium, are also being studied for GWT applications. While first results are encouraging, the magnetic properties of these materials must be considered to avoid anomalous transmission of electromagnetic signals and of the induced mechanical vibrations.

Four solutions regarding the design and manufacture of the pipes and their support system were discussed at the March workshop. The baseline is a 3 to 4 mm-thick tube similar to the ones operational in Virgo and LIGO, with some modifications to cope with the new tunnel environment and stricter sensitivity requirements. Another option is a 1 to 1.5 mm-thick corrugated vessel that requires neither reinforcement nor expansion bellows.

Designs based on double-wall pipes were also discussed, with a thin, easy-to-heat inner wall and an external wall performing the structural role. An insulation vacuum would be generated between the two walls, free of the cleanliness and pressure requirements imposed on the laser-beam vacuum. The forces acting on the inner wall during pressure transients would be minimised by opening axial movement valves, which are not yet fully designed. Finally, a gas-pipeline solution was considered, based on a half-inch-thick wall made of mild steel. Its main advantage is relatively low cost, as it is a standard approach in the oil and gas industry; however, corrosion protection and ultrahigh-vacuum needs would require surface treatment on both sides of the pipe walls, and these treatments are currently under study.

For all types of design, the integration of optical baffles (which provide an intermittent reduction of the pipe aperture to block scattered photons) is a matter of intense study, with options for position, material, surface treatment and installation reported. The transfer of vibrations from the tunnel structure to the baffles is another hot topic.

The manufacturing of the pipes directly from metal coils, and their surface treatment, can be carried out either at supplier facilities or directly at the installation site. The former approach would reduce the cost of infrastructure and manpower, while the latter would reduce transport costs and give the global logistics an additional degree of freedom, since storage areas would be minimised. The idea of in-situ production was taken to its limit in a conceptual study of a process that could deliver pipes of any desired length from a coil directly in the underground areas: the metal coil arrives in the tunnel and is installed in a dedicated machine that unrolls it and welds the metal sheet into a continuous pipe.

These topics will undergo further development in the coming months, and the results will be incorporated into a comprehensive technical design report, which will include a detailed cost optimisation and be validated in a pilot sector at CERN. With just under two and a half years of the project remaining, success will demand substantial effort and resolute motivation. The enthusiasm and collaborative approach demonstrated by all participants at the workshop are therefore highly encouraging.

Deep learning for safer driving https://cerncourier.com/a/deep-learning-for-safer-driving/ Wed, 01 Mar 2023 13:54:03 +0000 https://preview-courier.web.cern.ch/?p=107944 CERN and software company Zenseact have completed a three-year project investigating the use of AI in autonomous vehicles.

How quickly can a computer make sense of what it sees without losing accuracy? And to what extent can AI tasks on hardware be performed with limited computing resources? Aiming to answer these and other questions, car-safety software company Zenseact, founded by Volvo Cars, sought out CERN’s unique capabilities in real-time data analysis to investigate applications of machine-learning to autonomous driving. 

In the future, self-driving cars are expected to considerably reduce the number of road-accident fatalities. To advance developments, in 2019 CERN and Zenseact began a three-year project to research machine-learning models that could enable self-driving cars to make better decisions faster. Carried out in an open-source software environment, the project’s focus was “computer vision” – an AI discipline dealing with how computers interpret the visual world and then automate actions based on that understanding.

“Deep learning has strongly reshaped computer vision in the last decade, and the accuracy of image-recognition applications is now at unprecedented levels. But the results of our research show that there’s still room for improvement when it comes to running the deep-learning algorithms faster and being more energy-efficient on resource-limited on-device hardware,” said Christoffer Petersson, research lead at Zenseact. “Simply put, machine-learning techniques might help drive faster decision-making in autonomous cars.” 

The need to react fast and make quick decisions imposes strict runtime requirements on the neural networks that run on embedded hardware in an autonomous vehicle. By compressing the neural networks, for example using fewer parameters and bits, the algorithms can be executed faster and use less energy. For this task, the CERN–Zenseact team chose field-programmable gate arrays (FPGAs) as the hardware benchmark. Used at CERN for many years, especially for trigger readout electronics in the large LHC experiments, FPGAs are configurable integrated circuits that can execute complex decision-making algorithms in periods of microseconds. The main result of the FPGA experiment, says Petersson, was a practical demonstration that computer-vision tasks for automotive applications can be performed with high accuracy and short latency, even on a processing unit with limited computational resources. “The project clearly opens up for future directions of research. The developed workflows could be applied to many industries.”
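
As a minimal illustration of the compression idea (a generic post-training quantisation sketch in NumPy; this is neither Zenseact’s nor CERN’s code, and real FPGA flows involve further steps such as fixed-point arithmetic and retraining):

import numpy as np

def quantise_weights(w: np.ndarray):
    """Map float32 weights onto 255 uniform signed 8-bit levels."""
    scale = np.abs(w).max() / 127.0              # one scale factor per tensor
    w_int = np.round(w / scale).astype(np.int8)  # 8-bit integer weights
    return w_int, scale

def dequantise(w_int: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights, e.g. to check the accuracy loss."""
    return w_int.astype(np.float32) * scale

# Example: one layer's weights drop from 32-bit to 8-bit storage (a 4x
# memory saving) at the cost of a small, measurable rounding error.
rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=(64, 64)).astype(np.float32)
w_int, s = quantise_weights(w)
print(f"max rounding error: {np.abs(w - dequantise(w_int, s)).max():.2e}")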

The compression techniques in FPGAs elucidated by this project could also have a significant effect on “edge” computing, explains Maurizio Pierini of CERN: “Besides improving the trigger systems of ATLAS and CMS, future development of this research area could be used for on-site computation tasks, such as on portable devices, satellites, drones and obviously vehicles.”

CLEAR highlights and goals https://cerncourier.com/a/clear-highlights-and-goals/ Wed, 01 Mar 2023 13:25:57 +0000 https://preview-courier.web.cern.ch/?p=107873 The CERN Linear Electron Accelerator for Research (CLEAR) offers users a unique R&D facility for applications ranging from plasma accelerators to radiotherapy.

Particle accelerators have revolutionised our understanding of nature at the smallest scales, and continue to do so with facilities such as the LHC at CERN. Surprisingly, however, the number of accelerators used for fundamental research represents a mere fraction of the 50,000 or so accelerators currently in operation worldwide. Around two thirds of these are employed in industry, for example in chip manufacturing, while the rest are used for medical purposes, in particular radiotherapy. While many of these devices are available “off-the-shelf”, accelerator R&D in particle physics remains the principal driver of innovative, next-generation accelerators for applications further afield.

The CERN Linear Electron Accelerator for Research (CLEAR) is a prominent example. Launched in August 2017 (CERN Courier November 2017 p8), CLEAR is a user facility developed from the former CTF3 project which existed to test technologies for the Compact Linear Collider (CLIC) – a proposed e+e− collider at CERN that would follow the LHC. During the past five years, beams with a wide range of parameters have been provided to groups from more than 30 institutions across more than 10 nations.

CLEAR was proposed as a response to the low availability of test-beam facilities in Europe. In particular, there was very little time available to users on accelerators with electron beams at energies of a few hundred MeV, as these tend to be used in dedicated X-ray light-source and other specialist facilities. CLEAR therefore serves as a unique facility to perform R&D towards a wide range of accelerator-based technologies in this energy range. Independent of CERN’s other accelerator installations, CLEAR has been able to provide beams for around 35 weeks per year since 2018, as well as during long shutdowns, even maintaining successful operation during the COVID-19 pandemic.

Flexible physics

As a relatively small facility, CLEAR operates in a flexible fashion. Operators can vary the range of beams available with relative ease by tailoring many different parameters, such as the bunch charge, length and energy, for each user. There is regular weekly access to the machine and, thanks to the low levels of radioactivity, it is possible to gain access to the facility several times per day to adjust experimental setups if needed. Along with CLEAR’s location at the heart of CERN, the facility has attracted an eager stream of users from day one.

Among the first was a team from the European Space Agency working in collaboration with the Radiation to Electronics (R2E) group at CERN. The users irradiated electronic components for the JUICE (Jupiter Icy Moons Explorer) mission with 200 MeV electron beams. Their experiments demonstrated that high-energy electrons trapped in the strong magnetic fields around Jupiter could induce faults, so-called single event upsets, in the craft’s electronics, leading to the development and validation of components with the appropriate radiation-hardness. The initial experiment has been built upon by the R2E group to investigate the effect of electron beams on electronics.

Inspecting beamline equipment

As the daughter of CTF3, CLEAR has continued to be used to test the key technological developments necessary for CLIC. There are two prototype CLIC accelerating structures in the facility’s beamline. Originally installed to test CLIC’s unique two-beam acceleration scheme, the structures have been used to study short-range “wakefield kicks” that can deflect the beam away from the planned path and reduce the luminosity of a linear collider. Additionally, prototypes of the high-resolution cavity beam position monitors, which are vital to measure and control the CLIC beam, have been tested, showing promising initial results.

One of the main activities at CLEAR concerns the development and testing of beam instrumentation. Here, the flexibility and the large beam-parameter range provided by the facility, together with easy access, especially in its dedicated in-air test station, have proven to be very effective. CLEAR covers all phases of the development of novel beam diagnostics devices, from the initial exploration of a concept or physical mechanism to the first prototyping and to the testing of the final instrument adapted for use in an operational accelerator. Examples are beam-loss monitors based on optical fibres, and beam-position and bunch-length monitors based on Cherenkov diffraction radiation under development by the beam instrumentation group at CERN.

Advanced accelerator R&D

There is a strong collaboration between CLEAR and the Advanced Wakefield Experiment (AWAKE), a facility at CERN used to investigate proton-driven plasma wakefield acceleration. In this scheme, which promises higher acceleration gradients than conventional radio-frequency accelerator technology and thus more compact accelerators, charged particles such as electrons are accelerated by forcing them to “surf” atop a longitudinal plasma wave that contains regions of positive and negative charges. Several beam diagnostics for the AWAKE beamline were first tested and optimised at CLEAR. A second phase of the AWAKE project, presently being commissioned for operation in 2026, requires a new source of electron beams to provide shorter, higher quality beams. Before its final installation in AWAKE, it is proposed to use this source to increase the range of beam parameters available at CLEAR.

Installation of novel microbeam position monitors

Further research into compact, plasma-based accelerators has been undertaken at CLEAR thanks to the installation of an active plasma lens on the beamline. Such lenses use gases ionised by very high electric currents to provide focusing fields many orders of magnitude stronger than those achievable with conventional magnets. Previous work on active plasma lenses had shown that the focusing force was nonlinear and reduced the beam quality. However, experiments performed at CLEAR showed, for the first time, that by simply swapping the commonly used helium gas for a heavier gas like argon, a linear magnetic field could be produced and focusing could be achieved without reducing the beam quality (CERN Courier December 2018 p8).
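The strength of such a lens follows from Ampère’s law: a discharge current I distributed uniformly over a capillary of radius R produces an azimuthal field that grows linearly with radius, giving a focusing gradient g = μ0I/(2πR²). The sketch below evaluates this with illustrative numbers that are assumptions for the example, not CLEAR’s measured discharge parameters:

```python
import math

MU0 = 4e-7 * math.pi   # vacuum permeability [T m/A]

def plasma_lens_gradient(current_a, radius_m):
    """Focusing gradient [T/m] of an active plasma lens with uniform current
    density: B_phi(r) = mu0*I*r / (2*pi*R^2), so g = mu0*I / (2*pi*R^2)."""
    return MU0 * current_a / (2.0 * math.pi * radius_m**2)

# Assumed example values: a 500 A discharge through a 0.5 mm-radius capillary
g = plasma_lens_gradient(500.0, 0.5e-3)
print(f"focusing gradient ~ {g:.0f} T/m")   # ~400 T/m from a table-top device
```

For comparison, conventional electromagnetic quadrupoles typically provide gradients of order tens of T/m, and, unlike a plasma lens, focus in only one plane at a time.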

Plasma acceleration is not the only novel accelerator technology that has been studied at CLEAR over the past five years. The significant potential of using accelerators to produce intense beams of radiation in the THz frequency range has also been demonstrated. Such light, on the boundary between microwaves and infrared, is difficult to produce, but has a variety of uses ranging from imaging and security scanning to the control of materials at the quantum level. Compact linear-accelerator-based sources of THz light could be advantageous compared with other sources as they tend to produce significantly higher photon fluxes. By using long trains of ultrashort, sub-ps bunches, it was shown at CLEAR that THz radiation can be generated through coherent transition radiation in thin metal foils, through coherent Cherenkov radiation, and through coherent “Smith–Purcell” radiation in periodic gratings. The peak power emitted in experiments at CLEAR was around 0.1 MW. However, simulations have shown that with relatively minor reductions in the length of the electron bunches it will be possible to generate a peak power of more than 100 MW.
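The role of the ultrashort bunches is coherence: at wavelengths longer than the bunch, the fields of all N electrons add in phase and the radiated power scales as N² rather than N, weighted by the bunch form factor. A minimal sketch for a Gaussian bunch, using illustrative charge and length rather than quoted CLEAR parameters:

```python
import math

E_CHARGE = 1.602176634e-19   # elementary charge [C]

def coherent_gain(freq_hz, sigma_t_s, n_electrons):
    """Ratio of coherent to incoherent radiated power at frequency f for a
    Gaussian bunch of rms duration sigma_t: roughly N * F(w), where the
    squared form factor is F(w) = exp(-(w*sigma_t)^2)."""
    w = 2.0 * math.pi * freq_hz
    return n_electrons * math.exp(-((w * sigma_t_s) ** 2))

# Assumed example bunch: 50 pC at 100 fs rms, observed at 1 THz
n_e = 50e-12 / E_CHARGE                     # about 3e8 electrons
print(f"coherent enhancement at 1 THz: {coherent_gain(1e12, 100e-15, n_e):.1e}")
# Shortening the bunch pushes the form-factor cut-off to higher frequencies,
# which is why sub-ps bunch trains radiate so strongly in the THz band.
```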

FLASH forward

Advances in high-gradient accelerator technology for projects like CLIC (CERN Courier April 2018 p32) have led to a surge of interest in using electron beams with energies between 50 and 250 MeV to perform radiotherapy, which is one of the key tools used in the treatment of cancer. The use of so-called very-high-energy electron (VHEE) beams could provide advantages over existing treatment types. Of particular interest is using VHEE beams to perform radiotherapy at ultra-high dose rates, which could potentially generate the so-called FLASH effect in patients. Here, tumour cells are killed while sparing the surrounding healthy tissues, with the potential to significantly improve treatment outcomes.

FLASH radiotherapy

So far, CLEAR has been the only facility in the world studying VHEE radiotherapy and FLASH with 200 MeV electron beams. As such, there has been a large increase in beam-time requests in this field. Initial tests performed by researchers from the University of Manchester demonstrated that, unlike other types of radiotherapy beams, VHEE beams are relatively insensitive to inhomogeneities in tissue that typically result in less targeted treatment. The team, along with another from the University of Strathclyde, also looked at how focused VHEE beams could be used to further target doses inside a patient by mimicking the Bragg peak seen in proton radiotherapy. Experiments with the University Hospital of Lausanne to try to demonstrate whether the FLASH effect can be induced with VHEE beams are ongoing (CERN Courier January/February 2023 p8). 

Even if the FLASH effect can be produced in the lab, there are issues that need to be overcome to bring it to the clinic. Chief among them is the development of novel dosimetric methods. As CLEAR and other facilities have shown, conventional real-time dosimetric methods do not work at ultra-high dose rates. Ionisation chambers, the main pillar of conventional radiotherapy dosimetry, were shown to have very nonlinear behaviour at such dose rates, and recombination times that were too long. As a result, CLEAR has been involved in the testing of modified ionisation chambers as well as other, more innovative detector technologies from the world of particle physics for use in a future FLASH facility.

High impact 

As well as being a test-bed for new technologies and experiments, CLEAR has provided an excellent training infrastructure for the next generation of physicists and engineers. Numerous masters and doctoral students have spent a large portion of their time performing experiments at CLEAR either as one-time users or long-term collaborators. Additionally, CLEAR is used for practical accelerator training for the Joint Universities Accelerator School.

As in all aspects of life, the COVID-19 pandemic placed significant strain on the facility. The planned beam schedule for 2020 and beyond had to be scrapped as beam operation was halted during the first lockdown and external users were barred from travelling. However, through the hard work of the team, CLEAR was able to recover and run at almost full capacity within weeks. Several internal CERN users, many of whom were unable to travel to external facilities, were able to use CLEAR during this period to continue their research. Furthermore, CLEAR was involved in CERN’s own response to the pandemic by undertaking sterilisation tests of personal protective equipment.

Test-beam facilities such as CLEAR are vital for developing future physics technology, and the impact that such a small facility has been able to produce in just a few years is impressive. A variety of different experiments from several different fields of research have been performed, with many more that are not mentioned in this article. Unfortunately for the world of high-energy physics, the aforementioned shortage of accelerator test facilities has not gone away. CLEAR will continue to play its role in helping provide test beams, with operations due to continue until at least 2025 and perhaps long after. There is an exciting physics programme lined up for the next few years, featuring many experiments similar to those that have already been performed but also many that are new, to ensure that accelerator technology continues to benefit both science and society.

CERN, CHUV and THERYQ join forces for FLASH https://cerncourier.com/a/cern-chuv-and-theryq-join-forces-for-flash/ Tue, 10 Jan 2023 12:11:19 +0000 https://preview-courier.web.cern.ch/?p=107564 Tripartite agreement covers the development, regulatory compliance and construction of the first radiotherapy device capable of treating large, deep-seated tumours using the FLASH technique.

In November, CERN signed an agreement with the Lausanne University Hospital (CHUV) and medical-technology firm THERYQ to develop a novel “FLASH” radiotherapy device. The device – the first of its kind and based on CERN technology – will use very high-energy electrons (VHEEs) to treat cancers that are resistant to conventional treatments, with reduced side effects. Currently, around one third of cancers are resistant to conventional radiation therapy. 

VHEE FLASH technology has several advantages in addition to being capable of reaching deep-seated tumours. For example, high-energy electrons can be focused and oriented in a way that is almost impossible with X-rays, and radiotherapy devices based on electron accelerator technology will be more compact and less expensive than current proton-based therapy devices. 

FLASH radiotherapy has produced impressive results in pre-clinical animal studies at CHUV, and THERYQ, a spinoff of PMB-ALCEN, has been developing the technique in partnership with CHUV since the beginning of 2013. CERN has responded to the challenge of producing a high dose of very-high-energy electrons in less than 100 milliseconds, as required for FLASH radiotherapy, by designing a unique accelerator based on CLIC (Compact Linear Collider) technology. The device will include a compact linear accelerator, to be manufactured by THERYQ, and use VHEE beams with energies between 100 and 200 MeV, allowing all types of cancers up to a depth of 20 cm to be treated using the FLASH technique. It is expected to be operational within two years, with the first clinical trials planned for 2025.

The new tripartite agreement between CERN, CHUV and THERYQ covers the development, planning, regulatory compliance and construction of the world’s first radiotherapy device capable of treating large, deep-seated tumours using the FLASH technique. “FLASH therapy embodies the spirit of innovation that drives us in this field,” explains Philippe Eckert, director general of CHUV. “Eager to offer the most effective techniques to patients, we have joined forces with a world-class research centre and a cutting-edge industrial partner to solve a medical, physical and technical problem and find innovative solutions to fight cancer.” 

CERN and Airbus collaboration aims high https://cerncourier.com/a/cern-and-airbus-collaboration-aims-high/ Tue, 10 Jan 2023 12:09:56 +0000 https://preview-courier.web.cern.ch/?p=107566 New partnership will investigate the use of superconducting technologies in the electrical distribution systems of future hydrogen-powered aircraft.

Superconducting rare-earth barium copper oxide

On 1 December, CERN and Airbus UpNext, a wholly owned subsidiary of Airbus, launched a collaboration to explore the use of superconducting technologies in the electrical distribution systems of future hydrogen-powered aircraft. The partnership will bring together CERN’s expertise in superconducting technologies for particle accelerators and Airbus UpNext’s capabilities in aircraft design and manufacturing to develop a demonstrator known as SCALE (Super-Conductor for Aviation with Low Emissions).

Superconducting technologies could drastically reduce the weight of next-generation aircraft and increase their efficiency. If the expected performance and reliability objectives are achieved, the CERN–Airbus collaboration could reach the ambitious target of flying a fully integrated prototype within the next decade, says the firm. The joint initiative seeks to develop and test, in laboratory conditions, an optimised generic superconducting cryogenic (~500 kW) powertrain by the end of 2025. SCALE will be designed, constructed and tested by CERN using Airbus UpNext specifications and CERN technology. It will consist of a DC link (cable and cryostat) with two current leads, and a cooling system based on gaseous helium.

“Partnering with a leading research institute like CERN, which has brought the world some of the most important findings in fundamental physics, will help to push the boundaries of research in clean aerospace as we work to make sustainable aviation a reality,” said Sandra Bour-Schaeffer, CEO of Airbus UpNext. “We are already developing a superconductivity demonstrator called ASCEND (Advanced Superconducting and Cryogenic Experimental powertrain Demonstrator) to study the feasibility of this technology for electrically powered and hybrid aircraft. Combining knowledge obtained from our demonstrator and CERN’s unique capabilities in the field of superconductors makes for a natural partnership.”

Italy ramps up superconductor R&D https://cerncourier.com/a/italy-ramps-up-superconductor-rd/ Tue, 10 Jan 2023 12:07:55 +0000 https://preview-courier.web.cern.ch/?p=107567 A new project called IRIS will explore high-temperature and high-magnetic-field superconducting technologies both for societal applications and next-generation accelerators.

Developing high-temperature and high-magnetic-field superconducting technologies both for societal applications and next-generation particle accelerators is the goal of a new project in Italy called IRIS, launched in November and led by the INFN. IRIS (Innovative Research Infrastructure on applied Superconductivity) has received a €60 million grant from the Piano Nazionale di Ripresa e Resilienza to create a distributed R&D infrastructure throughout the country. It will focus on cables for low-loss electricity transport, and on the construction of superconducting magnets with high-temperature superconductors (HTS) in synergy with R&D for the proposed Future Circular Collider (FCC) at CERN. The project is estimated to last for 30 months, with more than 50% of the funds going to laboratories in the South of Italy.

One of the main objectives will be the construction in Salerno of a large infrastructure that will host not only a superconducting connection line, but also a centre of excellence for testing future industrial products for high-power connections, with the aim of making high-temperature superconductors less difficult and less expensive to work with. 

“With the IRIS project, Italy assumes a leading position in applied superconductivity, creating a real synergy between research institutions and universities, which will offer an important collaboration opportunity for particle physicists and those involved in the fields of superconductivity and magnetism,” explains IRIS technical coordinator Lucio Rossi of the University of Milan. “An aspect not to be overlooked is the high educational value of the project, which will guarantee numerous doctoral and high-level training opportunities for about 100 students, young researchers and technicians.”

The activities of IRIS will be coordinated by the Laboratory of Accelerators and Applied Superconductivity (LASA) in Milan, with many partners including the universities of Genova, Milano, Naples, Salento and Salerno, and the CNR Institute for Superconductors, Innovative Materials and Devices (SPIN). 

“IRIS is a virtuous example of how basic research, and in this case particle and accelerator physics, can provide an important application in other science areas, such as the development of new materials for energy saving that is essential for the creation of high-power cables without dissipation and suitable for the needs of future electricity networks serving new energy sources,” says Pierluigi Campana of INFN Frascati, IRIS scientific coordinator.

From dreams to beams: SESAME’s 30 year-long journey in science diplomacy https://cerncourier.com/a/from-dreams-to-beams-sesames-30-year-long-journey-in-science-diplomacy/ Mon, 09 Jan 2023 14:27:17 +0000 https://preview-courier.web.cern.ch/?p=107609 SESAME founder Eliezer Rabinovici describes the story behind this beacon for peaceful international collaboration, what its achievements have been, and what the future holds.

The SESAME booster and storage ring

SESAME (Synchrotron-light for Experimental Science and Applications in the Middle East) is the Middle East’s first major international research centre. It is a regional third-generation synchrotron X-ray source situated in Allan, Jordan, which broke ground on 6 January 2003 and officially opened on 16 May 2017. The current members of SESAME are Cyprus, Egypt, Iran, Israel, Jordan, Pakistan, Palestine and Turkey. Active current observers include, among others: the European Union, France, Germany, Greece, Italy, Japan, Kuwait, Portugal, Spain, Sweden, Switzerland, the UK and the US. The common vision driving SESAME is the belief that human beings can work together for a cause that furthers the interests of their own nations and that of humanity as a whole. 

The story of SESAME started at CERN 30 years ago. One day in 1993, shortly after the signature of the Oslo Accords by Israel and the Palestine Liberation Organization, the late Sergio Fubini, an outstanding scientist and a close friend and collaborator, approached me in the corridor of the CERN theory group. He told me that now was the time to test what he called “your idealism”, referring to future joint Arab–Israeli scientific projects. 

CERN is a very appropriate venue for the inception of such a project. It was built after World War II to help heal Europe and European science in particular. Abdus Salam, as far back as the 1950s, identified the light source as a tool that could help thrust what were then considered “third-world” countries directly to the forefront of scientific research. The very same Salam joined our efforts in 1993 as a member of the Middle Eastern Science Committee (MESC), founded by Sergio, myself and many others to forge meaningful scientific contacts in the region. By joining our scientific committee, Salam made public his belief in the value of Arab–Israeli scientific collaborations, something the Nobel laureate had expressed several times in private.

Participants of the SESAME users’ meeting

To focus our vision, that year I gave a talk on the status of Arab–Israeli collaborations at a meeting in Torino held on the occasion of Sergio’s 65th birthday. Afterwards we travelled to Cairo to meet Venice Gouda, the Egyptian minister for higher education, and other Egyptian officials. At that stage we were just self-appointed entrepreneurs. We were told that president Hosni Mubarak had made a decision to take politics out of scientific collaborations with Israel, so together we organised a high-quality scientific meeting in Dahab, in the Sinai desert. The meeting, held in a large Bedouin tent on 19–26 November 1995, brought together about 100 young and senior scientists from the region and beyond. It took place in the weeks after the murder of the Israeli prime minister Yitzhak Rabin, for whom, at the request of Venice Gouda, all of us stood for a moment of silence in respect. The silence echoes in my ears to this day. The first day of the meeting was attended by Jacob Ziv, president of the Israeli Academy of Sciences and Humanities, which had been supporting such efforts in general. It was thanks to the additional financial help of Miguel Virasoro, director-general of ICTP at the time, and also Daniele Amati, director of SISSA, that the meeting was held. All three decisions of support were made at watershed moments and on the spur of the moment. The meeting was followed by a very successful effort to identify concrete projects in which Arab–Israeli collaboration could be beneficial to both sides.

But attempts to continue the project were blocked by a turn for the worse in the political situation. MESC decided to retreat to Torino, where, during a meeting in November 1996, there was a session devoted to studying the possibilities of cooperation via experimental activities in high-energy physics and light-source science. During that session, the late German scientist Gus Voss suggested, on behalf of himself and Hermann Winnick from SLAC, bringing to the Middle East the components of BESSY, a German light source in Berlin that was about to be dismantled. Former Director-General of CERN Herwig Schopper also attended the workshop. MESC had built sufficient trust among the parties to provide an appropriate infrastructure to turn such an idea into something concrete.

Targeting excellent science 

A light source was very attractive thanks to the rich diversity of fields that can make use of such a facility, from biology through chemistry, physics and many more to archaeology and environmental sciences. Such a diversity would also allow the formation of a critical mass of real users in the region. The major drawback of the BESSY-based proposal was that there was no way a reconstructed dismantled “old” machine would be able to attract first-class scientists and science. 

Around that time, Fubini asked Schopper, who had a rich experience in managing complex experimental projects, to take a leadership position. The focus of possible collaborations was narrowed down to the construction of a large light source, and it was decided to use the German machine as a nucleus around which to build the administrative structure of the project. The absence of diplomatic relations among several of the members presented a serious challenge. At the suggestion of Schopper, following the example of the way CERN was assembled in the 1950s, the impasse was overcome by using the auspices of UNESCO to deposit the instruments for joining the project. The statutes of SESAME were to a large extent copied from those of CERN. A band of self-appointed entrepreneurs had evolved into a self-declared interim Council of SESAME, with Schopper as its president. The next major challenge was to choose a site.

SESAME beginnings

On 15 March 2000 I flew to Amman for a meeting on the subject. I met Khaled Toukan (the current director-general of SESAME) and, after studying a map sold at the hotel where we met, we discussed which site Israel would support. We also asked that a Palestinian be the director general. Due to various developments, none of which depended on Israel, this was not to happen. The decision on the site venue was taken at a meeting at CERN on 11 April 2000. Jordan, which had and has diplomatic relations with all the parties involved, was selected as the host state. BESSY was dismantled by Russian scientists, placed in boxes and shipped with assembly instructions to the Jordanian desert to be kept until the appropriate moment arose. This was made possible thanks to a direct contribution by Koichiro Matsuura, director-general of UNESCO at the time, and to the efforts of Khaled Toukan who has served in several ministerial capacities in Jordan.

With the administrative structure in place, it was time to address the engineering and scientific aspects of the project. Technical committees had designed a totally new machine, with BESSY serving as a boosting component. Many scientists in the region were introduced via workshops to the scientific possibilities that SESAME could offer. Scientific committees considered appropriate “day-one” beamlines, yet that day seemed very far in the future. Technical and scientific directors from abroad helped define the parameters of a new machine and identified appropriate beamlines to be constructed. Administrators and civil servants from the members started meeting regularly in the finance committee. Jordan began to build the facility to host the light source and made major additional financial contributions. 

Transformative agreements

At this stage it was time for the SESAME interim council to transform into a permanent body and in the process cut its umbilical cord from UNESCO. This transformation presented new hurdles because it was required of every member that wished to become a member of the permanent council that its head of state, or someone authorised by the head of state, sign an official document sent to UNESCO stating this wish. 

By 2008 the host building had been constructed. But it remained essentially empty. SESAME had received support from leading light-source labs all over the world – a spiritual source of strength to members to continue with the project. However, attempts to get significant funding failed time and again. It was agreed that the running costs of the project should be borne by the members, but the one-time large cost needed to construct a new machine was outside the budget parameters of most of the members, many of whom did not have a tradition of significant support for basic science. The European Union (EU) supported us in that stage only through its bilateral agreement with Jordan. In the end, several million Euros from those projects did find their way to SESAME, but the coffers of SESAME and its infrastructure remained skeletal.

Changing perceptions

In 2008 Herwig Schopper was succeeded by Chris Llewellyn Smith, another former Director-General of CERN, as president of the SESAME Council. His main challenge was to get the funding needed to construct a new light source and to remove from SESAME the perception that it was simply a reassembled old light source of little potential attraction to top scientists. In addition to searching for sources of significant financial support, there was an enormous amount of work still to be done in formulating detailed and realistic plans for the following years. A grinding systematic effort began to endow SESAME with the structure needed for a modern working accelerator, and to create associated information materials.

Llewellyn Smith, like his predecessor, also needed to deal with political issues. For the most part the meetings of the SESAME Council were totally devoid of politics. In fact, they felt to me like a parallel universe where administrators and scientists from the region get to work together in a common project, each bringing her or his own scars and prejudices and each willing to learn. That said, there were moments when politics did contaminate the spirit forming in SESAME. In some cases, this was isolated and removed from the agenda and in others a bitter taste remains. But these are just at the very margins of the main thrust of SESAME. 

Students, beamline scientists and magnets

The empty SESAME building started to be filled with radiation shields, giving the appearance of a full building. But the absence of the light-source itself created a void. The morale of the local staff was in steady decline, and it seemed to me that the project was in some danger. I decided to approach the ministry of finance in Israel. When I asked if Israel would make a voluntary contribution to SESAME of $5 million, I was not shown the door. Instead they requested to come and see SESAME, after which they discussed the proposal with Israel’s budget and planning committee and agreed to contribute the requested funds on the condition that others join them. 

Each member of the unlikely coalition – consisting of Iran, Israel, Jordan and Turkey – pledged an extra $5 million for the project in an agreement signed in Amman. Since then, Israel, Jordan and Turkey have stood up to their commitment, and Iran claims that it recognises its commitment but is obstructed by sanctions. The support from members encouraged the EU to dedicate $5 million to the project, in addition to the approximately $3 million directed earlier from a bilateral EU–Jordan agreement. In 2015 the INFN, under director Fernando Ferroni, gave almost $2 million. This made it possible to build a hostel, as offered by most light sources, which was named appropriately after Sergio Fubini. Many leading world labs, in a heartwarming expression of support, have donated equipment for future beam lines as well as fellowships for the training of young people.

Point of no return

With their help, SESAME crossed the point of no return. The undefined stuff dreams are made of turned into magnets and girders made of real hard steel, which I was able to touch as they were being assembled at CERN. The pace of events had finally accelerated, and a star-studded inauguration including attendance by the king of Jordan took place on 16 May 2017. During the ceremony, amazingly, the political delegates of different member states listened to each other without leaving the room (as is the standard practice in other international organisations). Even more unique was that each member-state delegate taking the podium gave essentially the same speech: “We are trying here to achieve understanding via collaboration.”

At that moment the SESAME Council presidency passed from Chris Llewellyn Smith to a third former CERN Director-General, Rolf Heuer. The high-quality 2.5 GeV electron storage ring at the heart of SESAME started operation later that year, driving two X-ray beamlines: one dedicated to X-ray absorption fine structure/X-ray fluorescence (XAFS/XRF) spectroscopy, and another to infrared spectro-microscopy. A third powder-diffraction beamline is presently being added, while a soft X-ray beamline “HESEB” designed and constructed by five Helmholtz research centres is being commissioned. In 2023 the BEAmline for Tomography at SESAME (BEATS) will also be completed, with the construction and commissioning of a beamline for hard X-ray full-field tomography. 

The unique SESAME facility started operating with uncanny normality. Well over 100 proposals for experiments were submitted and refereed, and beam time was allocated to the chosen experiments. Data was gathered, analysed and the results were and are being published in first-rate journals. Given the richness of archaeological and cultural heritage in the region, SESAME’s beamlines offer a highly versatile tool for researchers, conservators and cultural-heritage specialists to work together on common projects. The first SESAME Cultural Heritage Day took place online on 16 February 2022 with more than 240 registrants in 39 countries (CERN Courier July/August 2022 p19). 

Powered by renewable energy

Thanks to the help of the EU, SESAME has also become the world’s first “green” light source, its energy entirely generated by solar power, which also has the bonus of stabilising the energy bill of the machine. There is, however, concern that the only component used from BESSY, the “Microtron” radio-frequency system, may eventually break down, thus endangering the operation of the whole machine. 

SESAME continues to operate on a shoe-string budget. The current approved 2022 budget is about $5.3 million, much smaller than that of any modern light source. I marvel at the ingenuity of the SESAME staff allowing the facility to operate, and am sad to sense indifference to the budget among many of the parties involved. The world’s media has been less indifferent: the BBC, The New York Times, Le Monde, The Washington Post, Brussels Libre, The Arab Weekly, as well as regional newspapers and TV stations, have all covered various aspects of SESAME. In 2019 the AAAS highlighted the significance of SESAME by awarding five of its founders (Chris Llewellyn Smith, Eliezer Rabinovici, Zehra Sayers, Herwig Schopper and Khaled Toukan) with its 2019 Award for Science Diplomacy. 

SESAME was inspired by CERN, yet it was a much more challenging task to construct. CERN was built after the Second World War was over, and it was clear who had won and who had lost. In the Middle East the conflicts are not over, and there are different narratives on who is winning and who is losing, as well as what win or lose means. For CERN it took less than 10 years to set up the original construct; for SESAME it took about 25 years. Thus, SESAME now should be thought of as CERN was in around 1960.

On a personal note, it brings immense happiness that for the first time ever, Israeli scientists have carried out high-quality research at a facility established on the soil of an Arab country, Jordan. Many in the region and beyond have taken their people to a place their governments most likely never dreamt of or planned to reach. It is impossible to give due credit to the many people without whom SESAME would not be the success it is today. 

In many ways SESAME is a very special child of CERN, and often our children can teach us important lessons. As president of the CERN Council, I can say that the way in which the member states of SESAME conducted themselves during the decades of storms that affect our region serves as a benchmark for how to keep bridges for understanding under the most trying of circumstances. The SESAME spirit has so far been a lighthouse even to the CERN Council, in particular in light of the invasion of Ukraine (an associate member state of CERN) by the Russian Federation. Maintaining this attitude in a stormy political environment is very difficult. 

However SESAME’s story ends, we have proved that the people of the Middle East have within them the capability to work together for a common cause. Thus, the very process of building SESAME has become a beacon of hope to many in our region. The responsibility of SESAME in the next years is to match this achievement with high-quality scientific research, but it requires appropriate funding and help. SESAME is continuing very successfully with its mission to train hundreds of engineers and scientists in the region. Requests for beam time continue to rise, as do the number of publications in top journals. 

If one wants to embark on a scientific project to promote peaceful understanding, SESAME offers at least three important lessons: it should be one to which every country can contribute, learn and profit significantly from; its science should be of the highest quality; and it requires an unbounded optimism and an infinite amount of enthusiasm. My dream is that in the not-so-distant future, people will be able to point to a significant discovery and say “this happened at SESAME”.

Radiotherapy debut for proton linac https://cerncourier.com/a/radiotherapy-debut-for-proton-linac/ Mon, 09 Jan 2023 13:17:51 +0000 https://preview-courier.web.cern.ch/?p=107562 A novel proton accelerator for cancer treatment based on CERN technology is preparing to receive its first patients in the UK.

Hadron therapy, to which particle and accelerator physicists have contributed significantly during the past decades, has treated more than 300,000 patients to date. As collaborations and projects have grown over time, new methods aimed at improving and democratising this type of cancer treatment have emerged. Among them, therapy with proton beams from circular accelerators stands out as a particularly effective treatment: protons can obliterate tumours while sparing the surrounding healthy tissue to a greater degree than conventional electron or photon therapy. Unfortunately, present proton- and ion-therapy centres are large and very demanding on the design of buildings, accelerators and gantry systems.

A novel proton accelerator for cancer treatment based on CERN technology is preparing to receive its first patients in the UK. Advanced Oncotherapy (AVO), based in London, has developed a proton-therapy system called LIGHT (Linac Image-Guided Hadron Technology) – the result of more than 20 years of work at CERN and spin-off company ADAM, founded in 2007 to build and test linacs for medical purposes and now AVO’s Geneva-based subsidiary. LIGHT provides a proton beam that allows the delivery of ultra-high dose rates to deep-seated tumours. The initial acceleration to 5 MeV is based on radio-frequency quadrupole (RFQ) technology developed at CERN and supported by CERN’s knowledge transfer group. LIGHT reached the maximum treatment energy of 230 MeV at the STFC Daresbury site on 26 September. Four years after the first 16 m-long prototype was built and tested at LHC Point 2, this novel oncological linac will treat its first patients in collaboration with University Hospital Birmingham at Daresbury during the second half of 2023, marking the first use of a proton linear accelerator for cancer therapy.

LIGHT operates with components and designs developed by CERN, ENEA, the TERA Foundation and ADAM. Components of note include LIGHT’s RFQ, which contributes to its compact design, as well as 19 radio-frequency modules composed of four side-coupled drift-tube accelerating cavities based on a TERA Foundation design and 15 coupled accelerating cavities with industrial design by ADAM. Each module is controlled to vary the beam energy electronically, 200 times per second, depending on the depth of the tumour layer. This obviates the need for absorbers (or degraders), which greatly reduce the throughput of protons and produce copious unwanted radiation; removing them therefore reduces the volume of shielding material required. This design allows the linear accelerator to generate an extremely focused beam of 70 to 230 MeV and to target tumours in three dimensions, by varying the depth at which the radiation dose is delivered much faster than existing circular accelerators.
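To see why a 70–230 MeV window reaches tumours at essentially any depth, one can apply the Bragg–Kleeman range–energy rule for protons in water, R ≈ αE^p with α ≈ 0.0022 cm/MeV^p and p ≈ 1.77; this is a textbook approximation, not a LIGHT design formula:

```python
def proton_range_in_water_cm(energy_mev):
    """Approximate proton range in water from the Bragg-Kleeman rule,
    R = alpha * E^p, with alpha ~ 0.0022 cm/MeV^p and p ~ 1.77."""
    alpha, p = 2.2e-3, 1.77
    return alpha * energy_mev**p

for e_mev in (70, 150, 230):                # LIGHT's quoted energy window
    print(f"{e_mev:3d} MeV -> ~{proton_range_in_water_cm(e_mev):4.1f} cm depth")
# Roughly 4 cm at 70 MeV up to 33 cm at 230 MeV: stepping the energy module
# by module scans the Bragg peak through the full depth of a tumour.
```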

“Our mission is simple: democratise proton therapy,” says Nicolas Serandour, CEO of AVO. “The only way to fulfil this goal is through the development of a different particle accelerator and this is what we have achieved with the successful testing of the first-ever proton linear accelerator for medical purposes. Importantly, the excitement comes from the fact that cost reduction can be accompanied with better medical outcomes due to the quality of the LIGHT beam, particularly for cancers that still have a low prognosis. I cannot over-emphasise the importance that CERN and ADAM played in making this project a tangible reality for millions of cancer patients.”

Superconducting detector magnets for the future https://cerncourier.com/a/superconducting-detector-magnets-for-the-future/ Tue, 22 Nov 2022 13:18:50 +0000 https://preview-courier.web.cern.ch/?p=107334 Participants at the Superconducting Detector Magnets Workshop discussed the strong demand for developing future superconducting magnets.

The Superconducting Detector Magnets Workshop, co-organised by CERN and KEK, was held at CERN from 12 to 14 September in a hybrid format. Joining were 90 participants from 36 different institutes and companies, with 57 on-site and 33 taking part remotely.

The workshop aimed to bring together the physics community, detector magnet designers and industry to exchange ideas and concepts, foster collaboration, and to discuss the needs and R&D development goals for future superconducting detector magnets. A key goal was to address the issue of the commercial availability of aluminium-stabilised Nb-Ti/Cu conductor technology.

Fifteen physics-experiment projects, which have either been approved or are in the design phase, presented their needs and plans for superconducting detector magnets. These experiments covered a wide range of physics programmes for existing and future colliders, non-colliders and a space-based experiment. The presented projects showed a strong demand for aluminium-stabilised Nb-Ti/Cu conductor technology. Other conductor technologies featured during the workshop included cable-in-conduit technology (CICC) and aluminium-stabilised high-temperature-superconducting (HTS) technology.

Presentations by leading industrial partners showed that the industrial capability to produce superconducting detector magnets does exist, as long as a suitable conductor is available. It was also shown that aluminium-stabilised Nb-Ti/Cu conductors are not currently commercially available, although an R&D effort is ongoing with IHEP in China. In particular, the co-extrusion process needed to clad the Nb-Ti/Cu Rutherford cable with aluminium is a key missing ingredient in industry. At the same time, the presentations showed that other ingredients, such as Nb-Ti/Cu wire production, the cabling of strands into a Rutherford cable, the high-purity aluminium stabiliser itself and the technique for welding on aluminium-alloy reinforcements for high-strength conductors, are still available.

The main conclusion of the workshop was that, given the need for aluminium-stabilised Nb-Ti/Cu conductors for future superconducting detector magnet projects, it is important that the commercial availability of this conductor is re-established, which would require a leading effort from international institutes through collaboration and cooperation with industry. This world-leading effort will advance technologies to be transferred openly to industry and other laboratories. Of particular importance is the co-extrusion technology needed to bond the aluminium stabiliser to the Rutherford cable. Hybrid-structure technology through electron-beam welding or other approaches to maximise the performance of an Al-stabilised superconductor combined with high-strength Al-alloy is needed for high-stress detector magnets. Back-up solutions such as copper-coated and soldered aluminium stabilisers, copper-based stabilisers and CICC should also be considered. In the long term, aluminium-stabilised HTS technology will be important for specific detector-magnet applications.

The workshop was received with strong interest and enthusiasm, and it is expected that another will be organised in one to two years, depending on the progress being made.

Making high-performance digitisers for big-science projects https://cerncourier.com/a/making-high-performance-digitisers-for-big-science-projects/ Mon, 07 Nov 2022 14:05:58 +0000 https://preview-courier.web.cern.ch/?p=107035 Kacper Matuszyński showcases Teledyne's high-performance digitisers at this year’s Big Science Business Forum.

High-energy physics labs like CERN rely on the products and services of countless hi-tech companies, many of whom were represented at this year’s Big Science Business Forum, held in Granada, Spain, from 4–7 October 2022.

In this video, you can hear from Kacper Matuszyński, sales manager for Teledyne SP Devices, which makes high-performance digitisers for data acquisition. Based in Sweden, the company has been part of the multi-billion-dollar Teledyne Technologies since 2017.

“We specialise in high-speed systems, focusing on niches such as mass spectrometry, lidar or medical imaging,” says Matuszyński, speaking at the meeting in Granada. “We are a highly R&D-focused company, developing new products and offering our customised services.”

Taking plasma accelerators to market https://cerncourier.com/a/taking-plasma-accelerators-to-market/ Fri, 07 Oct 2022 14:03:04 +0000 https://preview-courier.web.cern.ch/?p=107005 A $15 million investment will enable US firm TAU Systems to build a marketable laser-driven particle accelerator.

In 1997, physics undergraduate Manuel Hegelich attended a lecture by a visiting professor that would change the course of his career. A new generation of ultra-short-pulse lasers had opened the possibility to accelerate particles to high energies using high-power lasers, a concept first developed in the late 1970s. “It completely captured my passion,” says Hegelich. “I understood the incredible promise for research and industrial advancement if we could make this technology accessible to the masses.” 

Twenty-five years later, Hegelich founded TAU Systems to do just that. In September the US-based firm secured a $15 million investment to build a commercial laser-driven particle accelerator. The target application is X-ray free-electron lasers (XFELs), only a handful of which exist worldwide due to the need for large radio-frequency linacs to accelerate electrons. Laser-driven acceleration could drastically reduce the size and cost of XFELs, says Hegelich, and offers many other applications such as medical imaging. 

Beam time

“As a commercial customer it is difficult to get time on the European XFEL at DESY or the LCLS at SLAC, but these are absolutely fantastic machines that show you biological and chemical interactions that you can’t see in any other way,” he explains. “TAU Systems’ business model is two-pronged: we will offer beam time, data acquisition and analysis as a full-service supplier as well as complete laser-driven accelerators and XFEL systems for sale to, among others, pharma and biotech, battery and solar technology, and other material-science-driven markets.”  

Laser-driven accelerators begin by firing an intense laser pulse at a gas target to excite plasma waves, upon which charged particles can “surf” and gain energy. Researchers worldwide have been pursuing the idea for more than two decades, demonstrating impressive accelerating gradients. CERN’s AWAKE experiment, meanwhile, is exploring the use of proton-driven plasmas that would enable even greater gradients. The challenge is to be able to extract a stable and reliable beam that is useful for applications.
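A standard yardstick for the gradients at stake is the cold-plasma wave-breaking field, E0 = me c ωp/e ≈ 96 √(n0 [cm⁻³]) V/m. The sketch below evaluates it for an illustrative (assumed) plasma density and sets it against the 10 GeV-over-10 cm figure quoted below:

```python
import math

# SI constants
E_CHARGE = 1.602176634e-19    # elementary charge [C]
M_E = 9.1093837015e-31        # electron mass [kg]
C = 2.99792458e8              # speed of light [m/s]
EPS0 = 8.8541878128e-12       # vacuum permittivity [F/m]

def wave_breaking_field(n0_per_cm3):
    """Cold, non-relativistic wave-breaking field E0 = m_e*c*omega_p/e, the
    standard estimate of the peak gradient a plasma wave can sustain [V/m]."""
    n0 = n0_per_cm3 * 1e6                                   # cm^-3 -> m^-3
    omega_p = math.sqrt(n0 * E_CHARGE**2 / (EPS0 * M_E))    # plasma frequency
    return M_E * C * omega_p / E_CHARGE

# Assumed example density for a laser-driven stage: n0 = 1e18 cm^-3
print(f"E0 ~ {wave_breaking_field(1e18) / 1e9:.0f} GV/m")   # ~96 GV/m
# For scale, 10 GeV gained over 10 cm is an average gradient of 100 GV/m,
# roughly a thousand times what conventional RF cavities provide.
```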

Hegelich began studying the interaction between ultra-intense electromagnetic fields and matter during his PhD at Ludwig Maximilian University in Munich. In 2002 he went to Los Alamos National Laboratory where he ended up leading their laser-acceleration group. A decade later, the University of Texas at Austin invited him to head up a group there. Hegelich has been on unpaid leave of absence since last year to focus on his company, which currently numbers 14 employees and rising. “We have got to a point where we think we can make a product rather than an experiment,” he explains. 

The breakthrough was to inject the gas target with nanoparticles with the right properties at the right time, so as to seed the wakefield sooner and thus enable a larger portion of the wave to be exploited. The resulting electron beam contains so much charge that it drives its own wave, capable of accelerating electrons to 10 GeV over a distance of just 10 cm, explains Hegelich. “The whole community has been chasing 10 GeV for a very long time, because if you ever wanted to build a big collider, or drive an XFEL, you’d need to put together 10 GeV acceleration stages. While gains were theorised, we saw something that was so much more powerful than what we were hoping for. Sometimes it’s better to be lucky than to be good!”

Hegelich says he was also lucky to attract an investor, German internet entrepreneur Lukasz Gadowski, so soon after he started looking last summer. “This is hardware development: it takes a lot of capital just to get going. Lukasz and I met by accident when I was consulting on a totally different topic. He has invested $15 million and is very interested in the technical side.” 

TAU Systems (the name comes from the symbol used for the laser pulse duration) aims to offer its first products for sale in 2024, have an XFEL service centre operational by 2026 and start selling full XFEL systems by 2027. Improving beam stability will remain the short-term focus, says Hegelich. “At Texas we have a laser system that shoots once per hour or so, with no feedback loop, so sometimes you get a great shot and most of the time you don’t. But we have done some experiments in other regimes with smaller lasers, and other groups have done remarkable work here and shown that it is possible to run for three days straight. Now that we have this company, I can hire actual engineers and programmers – a luxury I simply didn’t have as a university professor.”

He also doesn’t rule out more fundamental applications such as high-energy physics. “I am not going to say that we will replace a collider with a laser, although if things take off and if there is a multibillion-dollar project, then you never know.”

Neutron science: simplifying access for industry users https://cerncourier.com/a/neutron-science-simplifying-access-for-industry-users/ Fri, 16 Sep 2022 10:46:09 +0000 https://preview-courier.web.cern.ch/?p=106714 The ILL is positioning neutron science as a natural extension of industry’s R&D and innovation pipeline. Caroline Boudou and Mark Johnson share the lessons for other large-scale facilities.

The Institut Laue-Langevin (ILL) is an international research centre at the leading edge of neutron science and technology. As a service institute, the ILL makes its expertise available to about 1400 researchers every year across a suite of 40 state-of-the-art instruments. Taken together, those instruments provide the engine-room for a portfolio of unique analytical techniques that enables process, materials and device characterisation far beyond what’s possible in a traditional academic or industry laboratory – as well as spanning a diversity of disciplines from the physical sciences and engineering to pharmaceutical R&D, food science and cultural heritage.  

Yet while neutrons are unique and ubiquitous, they are neither widely available nor routinely accessible for applications in front-line research. Intense, tunable neutron beams can only be produced at nuclear reactors (like the ILL) or with high-power proton accelerators (so-called spallation sources). Consequently, there are no laboratory-based neutron sources for the initial training of early-career scientists and preliminary experiments – in contrast to the established development pathway afforded scientists transitioning from laboratory X-ray techniques to the large-scale synchrotron X-ray facilities.

By extension, scientists seeking to access neutrons as a research tool can only do so by securing beam time at large-scale facilities. That's not always straightforward. Large-scale neutron facilities have their own dedicated proposal and contract mechanisms for accessing beam time, adding to perceptions of "impenetrability" for new and occasional users – especially those working in industry. All of which raises a leading question: how can ILL – indeed Europe's large-scale research facilities generally – build on their successes to date in engaging the industrial R&D community and, in so doing, broaden their collective user base while simultaneously amplifying their societal and economic impact?

Scaling industry engagement

The Industry Liaison Unit (ILU) at ILL, nominally with two full-time staff, leads the laboratory’s industry outreach activities – typically through dedicated local, national and international industry events, all of which are supported by an active online and social media presence. By preparing contracts and agreements as required, the ILU also acts as the interface between the industry partners and ILL instrument scientists who will perform the experiments.

Access routes to ILL instruments

Working with industry can proceed along several routes, each with its own merits. For context, 80% of beam time at ILL is awarded through a competitive peer-review process, with two calls for proposals each year. For this so-called public access model, including precompetitive research at low technology readiness level (TRL), the ILL data policy requires open data and publishable results, with industry partners in many cases collaborating with academic research groups (and, in turn, tapping the latter’s high level of expertise in neutron science). Such “indirect” use of the ILL facilities is the most common access model for industry – though conversely the most difficult for the ILU to capture since the industry partners are not always “visible” members of the research collaboration.

At the other end of the spectrum, and often for projects with a high TRL, industry is able to request proprietary beam time for business-critical research using a paid-for access model. In return, any resulting experimental data remains private, the work can be covered by a non-disclosure agreement, any resulting intellectual property (IP) stays with the client, and experiments are scheduled on an appropriate timescale (usually as soon as possible). Often this sort of work may take the form of a consultancy, in which case the results (rather than just experimental data) are delivered by ILL scientists with the support of the ILU. 

Pyrotechnic equipment for applications in rocket launchers

In between these limiting cases, precompetitive research collaborations are an ideal way to build long-term industry engagement with ILL. Backed by European and/or national research funding, these initiatives typically run for several years and are often governed by memoranda of understanding. As such, the collaborative model allows an industry partner to gain experience and confidence at ILL while confirming the feasibility (or not) of its R&D goals – all of which can potentially lead on to requests for proprietary beam time. Partners often include established European technology companies – the likes of Rolls-Royce, EDF and STMicroelectronics, for example – or research and technical organisations (RTOs) – among them the French Alternative Energies and Atomic Energy Commission (CEA) and Germany’s Fraunhofer institutes. 

Operationally, the main experimental techniques used by industry at ILL are neutron imaging (specifically, radiography and tomography – the former used to reveal the internal structure of manufactured components, while the latter generates 3D images of a sample by measuring neutron absorbance); small-angle neutron scattering or SANS (elastic neutron scattering to investigate the structure of samples at the mesoscopic scale between 1–100 nm); powder diffraction (a form of elastic scattering that reveals atomic and magnetic structures); and strain scanning (which provides insights into strain and stress fields deep within an engineering component). It is worth adding that neutron imaging takes full advantage of the very intense, continuous neutron beams at ILL and offers potential for significant growth in industry engagement, assuming that capacity can be created to match demand.
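
To put those length scales in experimental terms: SANS probes a real-space dimension d at momentum transfer q = 2π/d, reached at a scattering angle given by q = (4π/λ)sin(θ/2). The short sketch below works through that arithmetic for the 1–100 nm window quoted above, assuming an illustrative cold-neutron wavelength of 0.6 nm (our assumption, not an ILL figure):

```python
import math

WAVELENGTH_NM = 0.6  # illustrative cold-neutron wavelength (assumption)

def q_from_d(d_nm: float) -> float:
    """Momentum transfer q (nm^-1) probing a real-space length scale d = 2*pi/q."""
    return 2 * math.pi / d_nm

def scattering_angle_deg(q_inv_nm: float) -> float:
    """Full scattering angle 2*theta from q = (4*pi/lambda) * sin(theta/2)."""
    return math.degrees(2 * math.asin(q_inv_nm * WAVELENGTH_NM / (4 * math.pi)))

for d in (1.0, 10.0, 100.0):  # the 1-100 nm SANS window quoted above
    q = q_from_d(d)
    print(f"d = {d:6.1f} nm -> q = {q:.4f} nm^-1 -> 2theta = {scattering_angle_deg(q):.2f} deg")
```

The 100 nm end of the window lands at a fraction of a degree – hence "small-angle" scattering.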

Where we are now 

It’s fair to say that ILL’s engagement with industry, while on an upward trajectory, remains a work in progress. The latest estimate of “indirect” use of beam time at ILL is that 15% of experiments (about 100 per year) are industry-relevant and likely to involve an industry partner. By monitoring who actually comes to ILL to carry out experiments, the ILU has identified 106 companies using the facility over the last decade. What’s more, proprietary beam time in 2021 involved 26 measurement periods by 14 unique industry customers with an aggregate income to ILL of €0.51 million.  

Custom ILL detector

In large part, though, it is pan-European and national research collaborations (precompetitive, low TRL) that continue to stimulate industry interaction with ILL. A case in point is the Integrated Infrastructure Initiative for Neutron Scattering and Muon Spectroscopy (NMI3), which held dedicated workshops with diverse industry partners; its follow-on project, Science and Innovation with Neutrons in Europe 2020 (SINE2020), included an industry consultancy work package with a budget of €1.5 million. The primary focus of the work package was to provide free feasibility studies for companies seeking to evaluate neutron techniques against their R&D requirements. In all, SINE2020 carried out 37 studies, 14 of which were conducted by ILL.

Two current European projects, BrightnESS-2 and EASI-STRESS, have a common focus on neutron measurements of residual stress – a critical factor, for example, in the mechanical stability of 3D-printed (additive-manufactured) components. The EASI-STRESS project aims to strengthen industrial access and uptake of non-destructive synchrotron X-ray and neutron-diffraction-based characterisation tools. The goal: to enable a better understanding of the formation and progression of residual stresses by direct incorporation of measured data into modelling tools. In parallel, part of the BrightnESS-2 remit is to support ongoing work at ILL on the standardisation of measurements across neutron facilities and instruments, delivering a quality approach that has been formalised as a Neutron Quality Label trademark.

A space-launcher fuel tank section

In this respect, it helps that ILL is colocated on the European Photon and Neutron Campus in Grenoble – a geographical convenience that allows close coordination with the adjacent European Synchrotron Radiation Facility (ESRF) when engaging with existing and prospective industry users. The two laboratories are core partners in the EASI-STRESS initiative as well as BIG-MAP, another pan-European project that brings together academic and industry partners on low-TRL battery research as part of the EU’s Battery2030+ roadmap. 

ILL and ESRF are also collaborating within the national IRT Nanoelec project, which enables ILL to showcase its unique analytical capabilities to address key R&D questions in the microelectronics industry. In this context, ILL has created a dedicated irradiation station that allows users to evaluate the sensitivity – and reliability – of electronic components subjected to low-energy (thermal) neutrons.

Industry success stories

Successful case studies for industry engagement can be found along several operational coordinates at ILL. In terms of the proprietary access model – with paid-for beam time plus full IP rights and confidentiality allocated to the customer – projects will typically focus on targeted lines of enquiry relating to a company’s manufacturing, R&D or failure-analysis requirements. 

As part of its quality control, for example, French aerospace company Dassault Aviation makes regular use of the ILL’s neutron imaging capability, with the focus on radiographic analysis of high-reliability pyrotechnic equipment for rocket launchers such as Ariane. Materials and process innovation also underpin a series of ILL measurements carried out by OHB and MT Aerospace (a group of companies specialising in space transportation, satellites and aircraft equipment), mainly to investigate residual stress on friction stir welds (when two facing metal workpieces are joined together by the heat generated from friction). The non-destructive determination of strain and stress maps provides primary data to optimise the company’s numerical models while also benchmarking versus destructive lab-based analysis techniques.

Notwithstanding the proprietary pathway, collaborative projects represent the most popular route for direct interaction between ILL and industrial researchers and RTOs. For example, a joint R&D initiative on metal additive manufacturing (MAM) kicked off in 2018 with the Fraunhofer-Institut für Werkstoff-und Strahltechnik (IWS) in Dresden. Using in-situ laser printing at SALSA, the stress-scanning instrument at ILL, the partners are delivering new knowledge of lasing parameters to ensure robust industrial production of MAM pieces. The initial 24-month project, involving teams from ILL and the Fraunhofer IWS, will be followed by further measurements in 2023 (part of a European Space Agency project that will also include additional X-ray imaging measurements at the ESRF).

InnovaXN: reinforcing industry connections

InnovaXN PhD programme

The InnovaXN PhD programme represents a ground-breaking approach to working with industry – a joint initiative between the ESRF and ILL in which 40 research projects are split between the two large-scale facilities (although most projects require the use of both synchrotron and neutron analytical probes). Launched in 2019, the programme is co-funded by the ESRF (25%) and ILL (25%), with the remainder covered by a Marie Skłodowska-Curie Actions grant agreement within the European Union’s Horizon 2020 programme. 

All InnovaXN projects have an academic and an industry partner (as well as the ESRF and/or ILL), with each PhD student spending at least three months at the industry partner during the course of their research. In this way, the programme attracts industry R&D teams and activities to ESRF and ILL to explore the use of their unique, cutting-edge synchrotron and neutron capabilities for precompetitive research. 

Equally important is the fact that InnovaXN students are exposed to the industry research environment, offering an additional career path post-PhD (either with the industry PhD partner or with another company). This represents the best form of industry awareness for ESRF and ILL, effectively seeding trained scientists in an industry setting. On the other hand, if students end up pursuing an academic career pathway, they will know how to collaborate with industry and how to exploit large-scale facilities when necessary. A win-win.

So far, there have been two intakes of InnovaXN students (in 2020 and 2021). In-progress projects involve 35 unique industry partners (some partners are involved in more than one project), with a quarter of these being SMEs or technology R&D centres. The top three industry sectors covered are energy production and storage; catalysis and chemistry; and pharmaceuticals and biotechnologies – a ranking that reflects the broad reach of neutron and synchrotron techniques for industrial applications.

Another example of collaboration involves Airbus Avionics, which is running a project to mitigate the risks associated with high- and low-energy (thermal) neutrons for avionics programmes. The ILL’s instruments were first used by Airbus to predict thermal neutron risks for state-of-the art semiconductor technologies – with direct measurements being the only way to estimate the real thermal neutron flux inside an aircraft. In 2021, the ILL therefore provided thermal neutron detectors for on-board use in commercial flights, whilst also sharing its technical expertise in this area. The design, development and implementation of advanced neutron detectors is at the heart of the ILL’s activity, as all of its scientific instruments require detectors with unique technical specifications. 

Meanwhile, there are many examples of precompetitive research performed at ILL in partnership with industry, often linked to the "indirect" use of the facility highlighted previously. A timely example is the work involving pharmaceutical company AstraZeneca, in which SANS was used to study lipid nanoparticles containing messenger RNA – the delivery mechanism for COVID-19 vaccines produced by Pfizer-BioNTech and Moderna. BioNTech also performed a SANS experiment at ILL in 2020.

Lessons learned, new perspectives 

With these and other notable success stories to build on, it's evident that industry use of large-scale research facilities like ILL will remain on an upward trajectory for the foreseeable future. Yet while the laboratory's near-term thrust is on outreach to industry – raising awareness of the unique R&D opportunities on offer – there's also a requirement for a dedicated selection path for applied R&D projects, with appropriate criteria to give industry streamlined access. Equally important is the ability for companies to study industry-relevant processes, samples and devices on ILL beam lines ("bringing industry to the neutrons"), while delivering experimental data or analysed research outcomes to the industrial customer per their requirements. Improved tracking (and subsequent promotion) of outcomes is another priority, with impact evaluated not just on a financial basis but also via other metrics, such as savings in energy and raw materials.

Hitting the target on medical radioisotopes

Radioisotope production is a core function of nuclear research reactors. At the ILL, which delivers one of the highest neutron fluxes available within the neutron science community, the focus is on producing low-yield, neutron-rich radioisotopes – and especially nonconventional medical radioisotopes with applications in highly targeted radionuclide cancer therapy. Examples include lutetium-177, which has been used in the treatment of over 1000 patients to date, and terbium-161, currently in the preclinical trial phase.

In 2021, ILL income from radioisotope production was close to €1 million, and plans are taking shape to increase production over the medium term. The ILL’s work in this area feeds into PRISMAP, an EU-funded initiative to develop an extensive infrastructure for nonconventional medical radioisotope production.

At the same time, so-called mediator companies are a growing – and increasingly vital – part of the mix. Operating at the interface between large-scale facilities and industry, these intermediary providers offer a broad portfolio of consultancy services – everything except the beam time – to enable industry customers to fast-track their R&D projects by easing access to the unique measurement capabilities offered by the big-science community. Examples include ANAXAM, a spin-off from the Paul Scherrer Institute in Switzerland, and Grenoble-based IROC Technologies – both of which are already connecting industry end-users with large-scale neutron facilities like ILL. Other companies – including Novitom in Grenoble and Finden in the UK – are facilitating industry research at synchrotron facilities like the ESRF, though could ultimately evolve to cover neutron techniques (see "Prioritising the industry customer").

In the long term, ILL and other laboratories like it must focus on lowering the barriers to engagement for small and medium-sized enterprises, as well as established technology companies, such that they come to see Europe's large-scale research facilities as a natural extension of their R&D and innovation pipeline. While indirect industry use of ILL will continue to grow, success a decade from now would mean an increase in the direct use of the facility by industry for precompetitive and proprietary research. Opportunity knocks.

  • The authors would like to acknowledge the role of various colleagues in industry-related work at ILL: Duncan Atkins (ILU); Manon Letiche (IRT Nanoelec); Sandra Cabeza, Thilo Pirling, Ralf Schweins, Lionel Porcar, Alessandro Tengattini, Lukas Helfen (all instrument scientists at ILL); Ed Mitchell, Ennio Capria (both ESRF).

CERN’s partnerships underpin a joined-up innovation pipeline https://cerncourier.com/a/cerns-partnerships-underpin-a-joined-up-innovation-pipeline/ Fri, 16 Sep 2022 10:45:37 +0000 https://preview-courier.web.cern.ch/?p=106690 From equipment procurement to knowledge-transfer initiatives, Giovanni Anelli, Anders Unnervik and Marzena Lapka provide a high-level tour of CERN’s unique innovation ecosystem.

CERN sits at the epicentre of a diverse innovation ecosystem. Developing and implementing the enabling technologies for the laboratory's particle accelerators, detectors and computing systems is only possible thanks to the sustained support of a global network of specialist industrial and institutional partners. Those applied R&D and product development collaborations come in many forms: from the upstream procurement of equipment and services across multiple industry supply chains to the structured transfer of CERN domain knowledge to create downstream growth opportunities for new and established technology companies. Emphasising the role of big science in delivering broad societal and economic impacts, the following snapshots showcase a technology innovation programme that is, quite simply, in a class of its own.

Procurement: a world of opportunity

CERN is budgeted to spend CHF 2.5 billion (€2.6 billion) on procurement of equipment and services for the period 2022–26 and is always looking to engage new industry suppliers. Contracts are awarded following price enquiries or invitations to tender. The former relate to contracts with an anticipated value below CHF 200,000 and are issued to a limited number of selected firms. Invitations to tender, meanwhile, are required for contracts with a value above CHF 200,000 and issued to firms qualified and selected based on a preceding open market survey. Prospective industry suppliers should visit https://procurement.web.cern.ch/ to register for CERN's procurement database.
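
The two-tier threshold described above reduces to a simple rule. The snippet below is purely illustrative – the function and its wording are ours, not a CERN system:

```python
def procurement_route(anticipated_value_chf: float) -> str:
    """Map an anticipated contract value to the access route described above."""
    THRESHOLD_CHF = 200_000  # boundary between price enquiry and invitation to tender
    if anticipated_value_chf < THRESHOLD_CHF:
        return "price enquiry (limited number of selected firms)"
    return "invitation to tender (firms qualified via open market survey)"

print(procurement_route(150_000))    # price enquiry
print(procurement_route(3_500_000))  # invitation to tender
```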

Procurement: amplifying the upsides 

CERN’s research environment

For industry suppliers, the benefits of doing business with CERN go well beyond direct financial returns on a given contract. Like all big science projects, CERN provides fertile ground for technology innovation. As such, industry partners are also investing in future visibility and impact within their given commercial setting. CERN, after all, is well known for its technological excellence, which means preferred suppliers must, as standard, push the boundaries of what's possible – creating a virtuous circle of positive impacts on firms' product innovation, sustainable practices, profitability and competitiveness. A 2018 research study, for example, investigated whether becoming a CERN supplier was linked to enhanced innovation performance within partner companies, and it showed a statistically significant correlation between CERN procurement contracts and corporate R&D, knowledge creation and commercial outcomes (see "Further reading").

Following the procurement roadmap 

The LHC is undergoing a major upgrade to sustain and extend its discovery potential. Scheduled to enter operation in 2029, the High-Luminosity LHC (HL-LHC) project is a complex undertaking that requires at-scale industry engagement for all manner of technology innovations, whether that's cutting-edge superconducting magnets or compact, ultraprecise superconducting RF cavities for beam rotation. What's more, the machine's enhanced luminosity (i.e. increased rate of collisions) will make new demands on the supporting vacuum, cryogenics and machine protection systems, while advanced concepts for collimation and diagnostics, beam modelling and beam-crossing schemes will also be required to maximise physics outputs. Industry is front and centre, with a pivotal role to play in delivering the core technologies needed to achieve the HL-LHC's scientific goals. Down the line, even bigger opportunities will come into play as the HL-LHC draws to a close (in 2040 or thereabouts). Designs are now in the works for the proposed Future Circular Collider (FCC), an advanced research infrastructure that would push the energy and intensity frontiers of particle accelerators into uncharted territory, reaching collision energies of 100 TeV (versus the LHC's current 13.6 TeV) in the search for new physics.

The engine-room of knowledge transfer

Although fundamental physics might not seem the most obvious discipline in which to find emerging technologies with marketable applications, CERN’s unique research environment – reliant as it is on diverse types of radiation, extremely low temperatures, ultrahigh magnetic fields and high-voltage power systems – represents a rich source of innovation spanning particle accelerators, detectors and scientific computing. Industry partnerships underpin CERN’s core research endeavour through the procurement of specialist services and co-development of cutting-edge components, subsystems and instrumentation – a process known as upstream innovation. Conversely, companies looking to solve innovation challenges are able to tap CERN’s capabilities to support technology development and growth opportunities within their own R&D pipeline – a process known as downstream innovation. In this way, companies and research institutes collaborate with CERN scientists and engineers to deliver breakthrough technologies ranging from cancer therapy to environmental monitoring, radiation-hardened electronics to banking and finance. 

Knowledge transfer at CERN: unique technologies, unprecedented performance

The applied R&D and technology advances that underpin CERN’s scientific mission are a rich source of product innovation for companies working across multiple industry sectors. Industry collaborations with CERN scientists and engineers – including the projects below – are overseen by the laboratory’s Knowledge Transfer Group.     

Next-generation radiotherapy

Compact Linear Collider

An R&D collaboration involving scientists from CERN and Lausanne University Hospital (CHUV) seeks to fast-track the development of a next-generation radiotherapy modality that will exploit very-high-energy electron (VHEE) beams to treat cancer patients. A dedicated VHEE facility, based at CHUV, will exploit the so-called FLASH effect to deliver high-dose VHEE radiation over short time periods (less than 200 ms) to destroy deep-seated tumours while minimising collateral damage to adjacent healthy tissue and organs at risk. The pioneering treatment system is based on the high-gradient accelerator technology developed for the proposed CLIC electron–positron collider at CERN. Teams from CHUV and its research partners have been performing preclinical studies related to VHEE and FLASH at the CERN Linear Electron Accelerator for Research (CLEAR), one of the few facilities available for characterising VHEE beams. 

The future of transportation

Self-driving vehicles

CERN has unique capabilities in real-time data processing. When beams of particles collide at the centre of a particle detector, new particles fly out in all directions. Different detector systems, arranged in layers around the collision point, use a range of techniques to identify the resulting particles, generating an enormous flow of data. Similar challenges apply to the development of autonomous vehicles, with the need for rapid interpretation of a multitude of real-time data streams generated under normal driving conditions. Zenseact, owned primarily by Volvo Cars, worked with CERN scientists to optimise machine learning algorithms, originally developed to support LHC data acquisition and analysis, for collision-avoidance scenarios in next-generation autonomous vehicles.

Sustainability and energy-efficiency

CERN’s cooling and ventilation infrastructure

Another high-profile innovation partner for CERN is ABB Motion, a technology leader in digitally enabled motor and drive solutions to support a low-carbon future for industry, infrastructure and transportation. The partnership has been launched to optimise the laboratory’s cooling and ventilation infrastructure, with the aim of reducing energy consumption across the campus. Specifically, CERN’s cooling and ventilation system will be equipped with smart sensors, which convert traditional motors, pumps, mounted bearings and gearing into smart, wirelessly connected devices. These devices will collect data that will be used to develop “digital twins” of selected cooling and ventilation units, allowing for the creation of energy-saving scenarios. Longer term, the plan is to disseminate the project learning publicly, so that industry and large-scale research facilities can apply best practice on energy-efficiency.

Intellectual property: getting creative  

A curated portfolio of intellectual property (IP) policies provides the framework for transferring CERN’s applied R&D and technology know-how to industry and institutional partners. Democratisation is the driver here, whatever the use-case. Many of the organisation’s projects, for example, are available via CERN’s Open Hardware Repository under the CERN Open Hardware Licence, offering a large user community the chance to transform prototype products and services into tangible commercial opportunities. CERN also encourages the creation of new spin-offs – companies based, partially or wholly, on CERN technologies – and supports such ventures with a dedicated IP policy. Custom licensing opportunities are available for more established start-up businesses seeking to apply CERN technologies within an existing product development programme.

Innovation partnerships: a call to action 

The Knowledge Transfer team at CERN is exploring a range of innovation partnerships across applied disciplines as diverse as energy and environment, healthcare, quantum science, machine learning and AI, and aerospace engineering. The unifying theme: to translate CERN domain knowledge and enabling technologies into broader societal and economic impacts. Companies should visit CERN’s Knowledge Transfer website (https://kt.cern/) to learn more about partnership opportunities, including R&D collaborations; technology licensing; services and consultancy; and starting up a new business based on CERN technology.  

IdeaSquare: networking young innovators

IdeaSquare is CERN’s platform for early-stage collaborations between students, scientists, other CERN personnel and relevant organisations working across multiple disciplines. The initiative operates at what it calls the “fuzzy front end” of the R&D and innovation process and seeks to “trigger transformations in the way we think about societal challenges and…identify solutions that will have a real impact on people’s lives”. In this way, IdeaSquare ties science innovation at CERN to the UN’s Sustainable Development Goals, engaging young innovators in the CERN Entrepreneurship Student Programme (CESP), for example, or Challenge-Based Innovation (CBI). Other activities include selected EU R&D projects; prototyping and innovation workshops; as well as international educational programmes. Prospective partners should visit https://ideasquare.cern and https://kt.cern/cesp for more information about the latest opportunities. 

Building bridges with industry https://cerncourier.com/a/building-bridges-with-industry/ Fri, 16 Sep 2022 10:45:29 +0000 https://preview-courier.web.cern.ch/?p=106725 The European Synchrotron Radiation Facility (ESRF) enables industry scientists to translate their materials R&D into high-impact technologies.

The European Synchrotron Radiation Facility (ESRF) in Grenoble, France, is among an elite class of fourth-generation advanced light sources – an X-ray "super-microscope" that enables researchers to illuminate the structure and behaviour of matter at the atomic and molecular level. As such, the ESRF's synchrotron beamlines offer leading-edge materials characterisation capabilities for applied scientists and engineers to address research challenges at all stages of the innovation life cycle – from product development and manufacturing through to operational studies related to ageing, wear-and-tear, restoration and recycling. Here CERN Courier talks to Ed Mitchell, head of business development at the ESRF, about the laboratory's evolving relationship with the industrial R&D community.

What does your role involve as head of business development?   

I lead a core team of seven staff looking after the ESRF's engagement with industry – though not the procurement of equipment and services. It's a broad-scope remit, covering industry as a user of the facility as well as technology transfer projects and R&D collaborations with industry partners as they arise. The business development activity increasingly dovetails with the outreach efforts of leading research technology organisations – the Fraunhofer institutes in Germany, for example, and the French Alternative Energies and Atomic Energy Commission (CEA) in France – which have extensive networks and amplify ESRF's engagement with industry at the regional and national level.

Ed Mitchell

The business development office is also responsible for identifying – and securing – strategic European Union (EU) grant opportunities. A case in point is InnovaXN, a joint PhD programme with the Institut Laue-Langevin (ILL), a neutron science facility here in Grenoble, and a ground-breaking approach to working with industry partners (see “Neutron science: simplifying access for industry users“). STREAMLINE is another of our important EU-funded projects (under Horizon 2020) and supports the recent ESRF-EBS (Extremely Brilliant Source) upgrade with new-look operation, access and automation procedures on several beamlines.  

How does your team engage new industry users and partners for the ESRF? 

Initiating and developing new industry contacts is a big part of what we do, though the challenge is always to talk to companies on their own terms, so that they understand the extent of the opportunities available at ESRF. A related issue is getting to the right people, especially in multinational companies with extensive R&D programmes. Sometimes we get lucky. At BASF, for example, we work closely with a senior applied research manager, someone who knows ESRF well having had links with us for many years. He’s an amazing contact, though the exception rather than the rule when it comes to industry engagement. 

What about the ESRF’s outreach efforts with small and medium-sized enterprises (SMEs)? 

There is EU funding available to help SMEs work with the ESRF and other advanced light sources in Europe. While this is relatively modest support, it is critical as a way of de-risking that first access for cash-strapped SMEs when they approach the big-science community. We need more of this support to scale our engagement with SMEs. Operationally, the so-called mediator companies are also incredibly important for bridging the gap to SMEs – as well as larger companies – helping them to plan, execute and deliver high-end materials characterisation services for their industrial problem-solving. It’s worth adding that the mediator companies offer value-added analysis of experimental results for research studies where we do not have the niche expertise – for example, petrochemical catalysis or the testing of consumer products (see “Prioritising the industry customer“).

So the mediator companies are one of the key elements of the ESRF’s engagement with industry?

Correct. I get a little frustrated when people imply that the mediators are simply making money off the back of the large-scale facilities. Mediator companies are another wholly valid element of the big-science ecosystem and should be celebrated as such. They add niche value, generate jobs and amplify the marketing and business development efforts of ESRF (and facilities like it) with prospective industry users. Their role is wholly positive. Entrepreneurs have seen a space, been innovative, and they’re making a living along the way. It’s a win–win. 

How is the industry user base at ESRF evolving?

A substantial majority of our commercial users used to be from the pharmaceutical sector, using structural biology for drug discovery. The pharma researchers are still there, but over the last decade the industry community has become more diverse, covering more industry sectors and using a broader portfolio of synchrotron techniques. What’s more, a lot of industry users are not – and don’t aspire to be – experts in synchrotron science. Instead, they just want access to the facility for what we might consider routine measurements rather than cutting-edge research. 

The ESRF

Those routine measurements – billed internally as “access to excellence everyday” – are only possible thanks to the specific qualities of a light source like the ESRF, with our science and technology experts working with industry to make such services more accessible and more automated. On the horizon, we can also see interest in some level of standard operating procedure for various industry use-cases, so that quality can be assured – though this will need to be considered within the context of facilities whose main mission is academic research.   

What steps can you take to remain aligned with industry’s changing requirements? 

Our task is to go out and listen to industry researchers and design the services they need for what they want to do – not what we think they might want. A case study in this regard is our collaboration with BASF in which ESRF and BASF scientists are co-developing a high-throughput mail-in service to support X-ray powder diffraction studies of hundreds of samples per shipment from the client’s R&D lab. This is essentially chemistry genomics, with the synchrotron beamlines providing automated and high-resolution studies of materials destined for applications in next-generation batteries, catalysts and the like. We hope to see more co-designed services being built with other companies very soon. 
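
For a flavour of what such an automated powder-diffraction service computes downstream, the sketch below applies Bragg's law (nλ = 2d sinθ, with n = 1) to turn measured peak positions into lattice d-spacings. The wavelength and peak angles are invented for illustration and are not BASF data:

```python
import math

def d_spacing(two_theta_deg: float, wavelength_angstrom: float) -> float:
    """Bragg's law: n*lambda = 2*d*sin(theta), taking first order (n = 1)."""
    theta = math.radians(two_theta_deg / 2)
    return wavelength_angstrom / (2 * math.sin(theta))

# Illustrative peak positions (degrees 2-theta) at an assumed 0.5 Å synchrotron wavelength
for peak in (9.8, 13.9, 17.1):
    print(f"2theta = {peak:5.1f} deg -> d = {d_spacing(peak, 0.5):.3f} Å")
```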

What about tracking the impact of industry research conducted at ESRF? 

This is always tricky. More often than not, the downstream impact of confidential industry R&D conducted at ESRF remains hidden even from our view. After all, companies are unlikely to reveal how much money they saved on their manufacturing process, for example, or whether a new product was an indirect or direct result of X-ray studies at our beamlines. 

In some ways, the laboratory needs perhaps just one killer quantifiable result every 10 years – think multibillion euro outcomes for industry – and the ESRF could be thought of as having justified its existence. Of course, this ignores the longer-term impact of the fundamental science conducted by academics – far and away the main user community at the ESRF. The bottom line: industry clients come back, they pay for access, so one has to assume that there is significant business impact for them. 

DESY’s innovation ecosystem delivers impact for industry https://cerncourier.com/a/desys-innovation-ecosystem-delivers-impact-for-industry/ Fri, 16 Sep 2022 10:45:19 +0000 https://preview-courier.web.cern.ch/?p=106705 Djamschid Safi explains how big science is helping European companies to fast-track the development of innovative products, services and applications.

Collaboration, applied research services and innovation networks: these are the reference points of an evolving business development strategy that’s building bridges between DESY’s large-scale research infrastructure and end-users across European industry. The goal: to open up the laboratory’s mission in basic science to support technology innovation and, by extension, deliver at-scale economic and societal impact. 

As a German national laboratory rooted in physics, and one of the world's leading accelerator research centres, DESY organises its scientific endeavours along four main coordinates: particle physics, photon science, astroparticle physics and accelerator physics. Those parallel lines of enquiry, pursued jointly with an established network of regional and international partners, make DESY a magnet for more than 3000 guest scientists from over 40 countries every year.

In the same way, the laboratory is a coveted research partner for industry and business, its leading-edge experimental facilities offering a unique addition to the R&D pipeline of Europe’s small and medium-sized enterprises as well as established technology companies. 

Technology transfer pathways

Industry collaboration with DESY is nothing if not diverse, spanning applied R&D and innovation initiatives across topics such as compact next-generation accelerator technologies, advanced laser systems for quality control in semiconductor-chip production, and the 3D printing of custom resins to create parts for use in ultrahigh-vacuum environments. While such cooperative efforts often involve established companies from many different industries, partners from academic science play an equally significant role – and typically with start-up or technology transfer ambitions as part of the mix. 

This is the case for an envisaged spin-off project in which scientists from the University of Hamburg and DESY are working together on a portable liquid biopsy device for medical diagnostics applications (for example, in cancer screening and treatment evaluation). Reciprocity is the key to success here: the university researchers bring their background in nanoanalytics to the project, while DESY physicists contribute deep domain knowledge and expertise on the development of advanced detector technologies for particle physics experiments. As a result, a prototype test station for high-sensitivity in-situ analysis is now in the works, with the interaction of the nanochannels and the detector in the test device representing a significant R&D challenge in terms of precision mechanics (while the DESY team also provides expertise in pattern recognition to accelerate the readout of test results).

Elsewhere, DESY’s MicroTCA Technology Lab (TechLab) represents a prominent case study of direct industry collaboration, fostering the uptake of the MicroTCA.4 open electronics standard for applications in research and industry. Originally developed for the telecommunications market, the standard was subsequently adapted by DESY and its network of industrial partners – among them NAT (Germany), Struck (Germany) and CAENels (Italy) – for deployment within particle accelerator control systems (enabling precision measurements of many analogue signals with simultaneous high-performance digital processing in a single controller). 

DESY’s MicroTCA Technology Lab

As such, MicroTCA.4 provides a core enabling technology in the control systems of the European X-ray Free Electron Laser (European XFEL), which runs over a 3.4 km span from DESY’s Hamburg location to the main European XFEL campus in the town of Schenefeld. Another bespoke application of the standard is to be found in the ground-based control centre of the Laser Interferometer Space Antenna (LISA), a space-based gravitational wave detector jointly developed by NASA and the European Space Agency. 

Underpinning this technology transfer success is a parallel emphasis on knowledge transfer and education. This is the reason why TechLab, which sits as part of the business development office at DESY, offers a programme of annual workshops and seminars for current and prospective users of MicroTCA.4. The long-term vision, moreover, is to develop a commercially self-sustaining TechLab spin-off based on the development and dissemination of MicroTCA into new applications and markets. 

Industry services to fast-track innovation

One of the principal drivers of industrial engagement with DESY is the laboratory’s PETRA III synchrotron light source (comprising a 2.3 km-circumference storage ring and 25 experimental beamlines). This high-brilliance X-ray facility is exploited by academic and industrial scientists to shed light on the structure and behaviour of matter at the atomic and molecular level across a range of disciplines – from clean-energy technologies to drug development and healthcare, from structural biology and nanotech to food and agricultural sciences.   

Operationally, DESY photon scientists and engineers ensure that industrial users are in a position to maximise the return on their PETRA III beam time, offering a portfolio of services that includes feasibility studies, sample preparation, execution of measurements, as well as downstream analysis of experimental data. Industry customers can request proprietary access to PETRA III via the business development office (under a non-disclosure agreement if necessary), while clients do not even need to come to DESY themselves, with options for a mail-in sample service or even remote access studies in certain circumstances. Publication is not required under this access model, though discounted fees are available to industry users willing to publish a “success story” or scientific paper in partnership with DESY. 

Magnetron-sputtering deposition system

Alongside the proprietary route, companies are able to access PETRA III beam time through academic partners. This pathway is free of charge and based on research proposals with strong scientific or socioeconomic impact (established via a competitive review process), with a requirement that results are subsequently published in the formal scientific literature. Notwithstanding the possibilities offered by DESY itself, the laboratory’s on-campus partners – namely the Helmholtz Centre Hereon and the European Molecular Biology Laboratory (EMBL) – use PETRA III to deliver a suite of dedicated services that help industry customers address diverse problems in the applied R&D pipeline. 

The German biotech company BioNTech is a case in point. Best known for its successful mRNA vaccine against SARS-Cov-2 infections, BioNTech conducted a programme of X-ray scattering experiments at EMBL’s PETRA III beamline P12. The results of these investigations are now helping the company’s scientists to better package mRNA within nanoparticles for experimental vaccines and drugs. Along an altogether different R&D track, PETRA III has helped industry users gain novel insights into the inner life of conventional AAA batteries – studies that could ultimately lead to products with significantly extended lifetimes. Using non-invasive X-ray diffraction computer tomography, applied under time-resolved conditions, the studies revealed aspects of the battery ageing process by examining phase transformations in the electrodes and electrolyte during charging and discharge. 

Building an innovation ecosystem 

While access to DESY’s front-line experimental facilities represents a game-changer for many industry customers, the realisation of new commercial products and technologies does not happen in a vacuum. Innovators, for their part, need specialist resources and expert networks to bring their ideas to life – whether that’s in the form of direct investment, strategic consultancy, business and entrepreneurship education, or access to state-of-the-art laboratories and workshops for prototyping, testing, metrology and early-stage product qualification. 

PETRA III beam time

DESY is single-minded in its support for this wider “innovation ecosystem”, with a range of complementary initiatives to encourage knowledge exchange and collaboration among early-career scientists, entrepreneurs and senior managers and engineers in established technology companies. The DESY Start-up Office, for example, offers new technology businesses access to a range of services, including management consultancy, business plan development and networking opportunities with potential suppliers and customers. There’s also the Start-up Labs Bahrenfeld, an innovation centre and technology incubator on the DESY Hamburg campus that provides laboratory and office space to young technology companies. The incubator’s current portfolio of 16 start-ups reflects DESY’s pre-eminence in lasers, detectors and enabling photonic technologies, with seven of the companies also targeting applications in the life sciences. 

A more focused initiative is the CAROTS 2.0 Startup School, which provides scientists with the core competencies for running their own scientific service companies (intermediary providers of analytical research services to help industry make greater use of large-scale science facilities like DESY). Longer term, the DESY Innovation Factory is set to open in 2025, creating an ambitious vehicle for the commercial development of novel ideas in advanced materials and the life sciences, while fostering cooperation between the research community and technology companies in various growth phases. There will be two locations, one on the DESY campus and one in the nearby innovation and technology park Vorhornweg.

Basic science, applied opportunities

If the network effects of DESY’s innovation ecosystem are a key enabler of technology transfer and industry engagement, so too is the relentless evolution of the laboratory’s accelerator R&D programme. Consider the rapid advances in compact plasma-based accelerators, offering field strengths in the GV/m regime and the prospect of a paradigm shift to a new generation of user-friendly particle accelerators – even potentially “bringing the accelerator to the problem” for specific applications. With a dedicated team working on the miniaturisation of particle accelerators, DESY is intent on maturing plasma technologies for its core areas of expertise in particle physics and photon science while simultaneously targeting medical and industrial use-cases from the outset.

A laser plasma subsystem under vacuum

Meanwhile, plans are taking shape for PETRA IV and conversion of the PETRA storage ring into an ultralow-emittance synchrotron source. By generating beams of hard X-rays with unprecedented coherence properties that can be focused down to the nm regime, PETRA IV will provide scientists and engineers with the ultimate 3D process microscope for all manner of industry-relevant problems – whether that’s addressing individual organelles in living cells, following metabolic pathways with elemental and molecular specificity, or observing correlations in functional materials over mm length scales and under working conditions. 

Fundamental science never stops at DESY. Neither, it seems, do the downstream opportunities for industrial collaboration and technology innovation.

From atomic to nuclear clocks https://cerncourier.com/a/from-atomic-to-nuclear-clocks/ Mon, 05 Sep 2022 09:13:29 +0000 https://preview-courier.web.cern.ch/?p=105691 Recent progress in understanding thorium’s nuclear structure could enable an ultra-accurate nuclear clock with applications in fundamental physics.

Artist’s rendition of a nuclear optical clock

For the past 60 years, the second has been defined in terms of atomic transitions between two hyperfine states of caesium-133. Such transitions, which correspond to radiation in the microwave regime, enable state-of-the-art atomic clocks to keep time at the level of one second in more than 300 million years. A newer breed of optical clocks developed since the 2000s exploit frequencies that are about 10^5 times higher. While still under development, optical clocks based on aluminium ions are already reaching accuracies of about one second in 33 billion years, corresponding to a relative systematic frequency uncertainty below 1 × 10^-18. 
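
Those headline figures translate directly into fractional frequency uncertainties – one second of accumulated error divided by the elapsed time. A quick check of the arithmetic:

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def fractional_uncertainty(years_per_second_lost: float) -> float:
    """One second of accumulated error over the stated span, as a fraction."""
    return 1.0 / (years_per_second_lost * SECONDS_PER_YEAR)

print(f"caesium clock, 1 s in 3e8 yr : {fractional_uncertainty(3e8):.1e}")   # ~1.1e-16
print(f"Al+ clock,    1 s in 33e9 yr : {fractional_uncertainty(33e9):.1e}")  # ~9.6e-19, below 1e-18
```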

To further reduce these uncertainties, in 2003 Ekkehard Peik and Christian Tamm of Physikalisch-Technische Bundesanstalt in Germany proposed the use of a nuclear instead of atomic transition for time measurements. Due to the small nuclear moments (corresponding to the vastly different dimensions of atoms and nuclei), and thus the very weak coupling to perturbing electromagnetic fields, a “nuclear clock” is less vulnerable to external perturbations. In addition to enabling a more accurate timepiece, this offers the potential for nuclear clocks to be used as quantum sensors to test fundamental physics. 

Clockwork 

A clock typically consists of an oscillator and a frequency-counting device. In a nuclear clock (see “Nuclear clock schematic” figure), the oscillator is provided by the frequency of a transition between two nuclear states (in contrast to a transition between two states in the electronic shell in the case of an atomic clock). For the frequency-counting device, a narrow-band laser resonantly excites the nuclear-clock transition, while the corresponding oscillations of the laser light are counted using a frequency comb. This device (the invention of which was recognised by the 2005 Nobel Prize in Physics) is a laser source whose spectrum consists of a series of discrete, equally spaced frequency lines. After a certain number of oscillations, given by the frequency of the nuclear transition, one second has elapsed. 

Nuclear clock schematic

The need for direct laser excitation strongly constrains applicable nuclear-clock transitions: their energy has to be low enough to be accessible with existing laser technology, while simultaneously exhibiting a narrow linewidth. As the linewidth is determined by the lifetime of the excited nuclear state, the latter has to be long enough to allow for highly stable clock operation. So far, only the metastable (isomeric) first excited state of 229Th, denoted 229mTh, qualifies as a candidate for a nuclear clock, due to its exceptionally low excitation energy. 

The existence of the isomeric state was conjectured in 1976 from gamma-ray spectroscopy of 229Th, and its excitation energy has only recently been determined to be 8.19 ± 0.12 eV (corresponding to a vacuum-ultraviolet wavelength of 151.4 ± 2.2 nm). Not only is it the lowest nuclear excitation among the roughly 184,000 excited states of the 3300 or so known nuclides, its expected lifetime is of the order of 1000 s, resulting in an extremely narrow relative linewidth (ΔE/E ~ 10^-20) for its ground-state transition (see "Unique transition" figure). Besides high resilience against external perturbations, this represents another attractive property for a thorium nuclear clock. 
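
From the numbers quoted above, the basic parameters of the clock transition follow from E = hν, λ = hc/E and a natural linewidth Δν ≈ 1/(2πτ). A minimal verification, taking the order-of-magnitude 1000 s lifetime at face value:

```python
import math

H_EV_S = 4.135_667_696e-15   # Planck constant in eV*s
HC_EV_NM = 1239.841_984      # h*c in eV*nm

E_ISOMER_EV = 8.19           # 229mTh excitation energy quoted above
TAU_S = 1000.0               # expected isomeric lifetime (order of magnitude)

freq_hz = E_ISOMER_EV / H_EV_S
wavelength_nm = HC_EV_NM / E_ISOMER_EV
linewidth_hz = 1 / (2 * math.pi * TAU_S)       # natural linewidth of the transition

print(f"frequency         : {freq_hz:.3e} Hz")         # ~2.0e15 Hz
print(f"wavelength        : {wavelength_nm:.1f} nm")   # ~151 nm (VUV)
print(f"relative linewidth: {linewidth_hz / freq_hz:.1e}")  # ~1e-20
```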

Achieving optical control of the nuclear transition via direct laser excitation would open a broad range of applications. A nuclear clock's sensitivity to the gravitational redshift, which causes a clock's relative frequency to change depending on its absolute height, could enable more accurate global positioning systems and high-sensitivity detection of fluctuations of Earth's gravitational potential induced by seismic or tectonic activities. Furthermore, while the few-eV thorium transition emerges from a fortunate near-degeneracy of the two lowest nuclear-energy levels in 229Th, the Coulomb and strong-force contributions to these energies differ at the MeV level. This makes the nuclear-level structure of 229Th uniquely sensitive to variations of fundamental constants and to ultralight dark matter. Many theories predict variations of the fine-structure constant, for example, but at tiny yearly rates. The high sensitivity provided by the thorium isomer could allow such variations to be identified. Moreover, networks of ultra-precise synchronised clocks could enable a search for ultralight dark-matter signals. 

Two different approaches have been proposed to realise a nuclear clock: one based on trapped ions and another using doped solid-state crystals. The first approach starts from individually trapped Th ions, which promises an unprecedented suppression of systematic clock-frequency shifts and leads to an expected relative clock accuracy of about 1 × 10^-19. The other approach relies on embedding 229Th atoms in a vacuum-ultraviolet (VUV) transparent crystal such as CaF2. This has the advantage of a large concentration (> 10^15 per cm^3) of Th nuclei in the crystal, leading to a considerably higher signal-to-noise ratio and thus a greater clock stability. 
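
The stability argument is essentially a √N effect: the quantum-projection-noise-limited instability of a clock scales as (Δν/ν0)/√(Nτ/Tc), up to order-one prefactors. The sketch below illustrates only this scaling – the 1 Hz interrogation linewidth and 1 s cycle time are assumptions, and in a real solid-state clock line broadening and detection inefficiency would erode much of the ideal gain:

```python
import math

def allan_dev(delta_nu_hz, nu0_hz, n_emitters, tau_s, cycle_s=1.0):
    """Projection-noise-limited instability, up to O(1) prefactors:
       sigma_y(tau) ~ (delta_nu / nu0) / sqrt(N * tau / T_cycle)."""
    return (delta_nu_hz / nu0_hz) / math.sqrt(n_emitters * tau_s / cycle_s)

nu0 = 2.0e15   # ~8.19 eV clock transition frequency in Hz
dnu = 1.0      # assumed laser-limited interrogation linewidth in Hz

for n, label in ((1, "single trapped ion"), (1e15, "Th-doped crystal (~1e15 nuclei)")):
    print(f"{label:32s} sigma_y(1 s) ~ {allan_dev(dnu, nu0, n, 1.0):.1e}")
```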

Precise characterisation 

A precise characterisation of the thorium isomer's properties is a prerequisite for any kind of nuclear clock. In 2016 the present authors and colleagues made the first direct identification of 229mTh by detecting electrons emitted from its dominant decay mode: internal conversion (IC), whereby a nuclear excited state decays by the direct emission of one of its atomic electrons (see "Isomeric signal" figure). This brought the long-term objective of a nuclear clock into the focus of international research. 

Currently, experimental access to 229mTh is possible only via radioactive decays of heavier isotopes or by X-ray pumping from higher-lying rotational nuclear levels, as shown by Takahiko Masuda and co-workers in 2019. The former, based on the alpha decay of 233U (2% branching ratio), is the most commonly used approach. Very recently, however, a promising new experiment exploiting the β decay of 229Ac, led by a team at KU Leuven, was performed at CERN's ISOLDE facility. Here, 229Ac is produced and mass-separated online before being implanted into a large-bandgap VUV-transparent crystal. In both population schemes, either photons or conversion electrons emitted during the isomeric decay are detected. 

Detection of the isomer’s decay

In the IC-based approach, a positively charged 229mTh ion beam is generated from alpha-decay daughter products recoiling off a 233U source placed inside a buffer-gas stopping cell. The decay products are thermalised, guided by electric fields towards an exit nozzle, extracted into a longitudinally 15-fold segmented radiofrequency quadrupole (RFQ) that acts as an ion guide, phase-space cooler and optionally a beam buncher, followed by a quadrupole mass separator for beam purification. In charged thorium isomers, the otherwise dominant IC decay branch is energetically forbidden, leading to a prolongation of the lifetime by up to nine orders of magnitude. 

Operating the segmented RFQ as a linear Paul trap to generate sharp ion pulses enables the half-life of the thorium isomer to be determined. In work performed by the present authors in 2017, pulsed ions from the RFQ were collected and neutralised on a metal surface, triggering their IC decay. Since the long ionic lifetime was inaccessible due to the limited ion-storage time imposed by the trap’s vacuum conditions, the drastically reduced lifetime of neutral isomers was targeted. Time-resolved detection of the low-energy conversion electrons determined the lifetime to be 7 ± 1 μs. 
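
Extracting such a lifetime from time-resolved electron counts amounts to fitting an exponential decay plus background. A self-contained sketch on synthetic data (the published analysis will differ in detail):

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(seed=1)

# Synthetic stand-in for time-resolved conversion-electron counts (true tau = 7 us)
t = np.linspace(0, 50, 60)                       # microseconds after neutralisation
counts = 500 * np.exp(-t / 7.0) + rng.poisson(5, t.size)

def decay(t, amplitude, tau, background):
    """Exponential decay on a flat background."""
    return amplitude * np.exp(-t / tau) + background

popt, pcov = curve_fit(decay, t, counts, p0=(400, 5, 5))
tau, dtau = popt[1], np.sqrt(pcov[1, 1])
print(f"fitted lifetime: tau = {tau:.1f} +/- {dtau:.1f} us")
```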

Excitation energy

Recently, considerable progress has been made in determining the 229mTh excitation energy – a milestone en route to a nuclear clock. In general, experimental approaches to determine the excitation energy fall into three categories: indirect measurements via gamma-ray spectroscopy of energetically low-lying rotational transitions in 229Th; direct spectroscopy of fluorescence photons emitted in radiative decays; and via electrons emitted in the IC decay of neutral 229mTh. The first approach led to the conjecture of the isomer’s existence and finally, in 2007, to the long-accepted value of 7.6 ± 0.5 eV. The second approach tries to measure the energy of photons emitted directly in the ground-state decay of the thorium isomer. 

Isomeric nuclear levels

The first direct measurement of the thorium isomer’s excitation energy was reported by the present authors and co-workers in 2019. Using a compact magnetic-bottle spectrometer equipped with a repulsive electrostatic potential, followed by a microchannel-plate detector, the kinetic energy of the IC electrons emitted after an in-flight neutralisation of Th ions from a 233U source could be determined. The experiment provided a value for the excitation energy of the nuclear-clock transition of 8.28 ± 0.17 eV. At around the same time in Japan, Masuda and co-workers used synchrotron radiation to achieve the first population of the isomer via resonant X-ray pumping into the second excited nuclear state of 229Th at 29.19 keV, which decays predominantly into 229mTh. By combining their measurement with earlier published gamma-spectroscopic data, the team could constrain the isomeric excitation energy to the range 2.5–8.9 eV. More recently, in work led by teams at Heidelberg and Vienna, excited isomers were implanted into the absorber of a custom-built cryogenic magnetic micro-calorimeter and the isomeric energy was measured by detecting the temperature-induced change of the magnetisation using SQUIDs. This produced a value of 8.10 ± 0.17 eV for the clock-transition energy, resulting in a world average of 8.19 ± 0.12 eV. 
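
Converting the world average into optical quantities shows why VUV laser technology is required; a quick sketch using only standard constants:

```python
# Convert the 229mTh clock-transition energy to wavelength and frequency.
h = 4.135667e-15        # Planck constant in eV s
c = 2.998e8             # speed of light in m/s
E, dE = 8.19, 0.12      # world average and uncertainty, in eV

wavelength_nm = h * c / E * 1e9     # ~151 nm: deep in the vacuum ultraviolet
frequency_thz = E / h / 1e12        # ~1980 THz
uncertainty_thz = dE / h / 1e12     # ~29 THz, still to be narrowed enormously
print(f"{wavelength_nm:.1f} nm, {frequency_thz:.0f} +/- {uncertainty_thz:.0f} THz")
```

The transition thus sits near 151 nm, and the remaining uncertainty of about 29 THz illustrates the scale of the gap to the sub-kHz precision a clock requires.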

Besides precise knowledge of the excitation energy, another prerequisite for a nuclear clock is the possibility to monitor the nuclear excitation on short timescales. Peik and Tamm proposed a method to do this in 2003 based on the “double resonance” principle, which requires knowledge of the hyperfine structure of the thorium isomer. To this end, in 2018, two different laser beams were collinearly superimposed on the 229Th ion beam, initiating a two-step excitation in the atomic shell of 229Th. By varying both laser frequencies, resonant excitations of hyperfine components of both the 229Th ground state and the 229mTh isomer could be identified, and thus the hyperfine splitting signature of both states could be established by detecting their de-excitation (see “Hyperfine splitting” figure). This observation of the 229mTh hyperfine structure will in future allow a non-destructive verification of the nuclear excitation; it also enabled the isomer’s magnetic dipole and electric quadrupole moments, as well as its mean-square charge radius, to be determined. 

Roadmap towards a nuclear clock

So far, the identification and characterisation of the thorium isomer has largely been driven by nuclear physics, where techniques such as gamma spectroscopy, conversion-electron spectroscopy and radioactive decays describe energies in units of electron volts. Now the challenge is to refine our knowledge of the isomeric excitation energy with laser-spectroscopic precision to enable optical control of the nuclear-clock transition. This requires bridging a gap of about 12 orders of magnitude in the precision of the 229mTh excitation energy, from around 0.1 eV to the sub-kHz regime. In a first step, existing broad-band laser technology can be used to localise the nuclear resonance with an accuracy of about 1 GHz. In a second step, VUV frequency-comb spectroscopy, presently under development, is envisaged to improve the accuracy into the (sub-)kHz range. 

Hyperfine splitting

Another practical challenge when designing a high-precision ion-trap-based nuclear clock is the generation of thermally decoupled, ultra-cold 229Th ions via laser cooling. 229Th3+ is particularly suited due to its electronic level structure, with only one valence electron. Given the high chemical reactivity of thorium, a cryogenic Paul trap is the ideal environment for laser cooling: at 4 K almost all residual gas atoms freeze out, increasing the trapping time to a few hours. This will form the basis for direct laser excitation of 229mTh and will also enable a measurement of the as-yet undetermined lifetime of the isomer in 229Th ions. For the alternative development of a compact solid-state nuclear clock, it will be necessary to suppress the 229mTh decay via internal conversion in a large-bandgap, VUV-transparent crystal and to detect the γ decay of the excited nuclear state. Proof-of-principle studies of this approach are currently ongoing at ISOLDE. 

Many of the recent breakthroughs in understanding the 229Th clock transition emerged from the European Union project “nuClock”, which concluded in 2019. A subsequent project, ThoriumNuclearClock (ThNC), aims to demonstrate at least one nuclear clock by 2026. Laser-spectroscopy activities on the thorium isomer are also ongoing in the US, for example at JILA, NIST and UCLA. 

In view of the great progress of recent years and the ongoing worldwide efforts, both experimental and theoretical, the road is paved towards the first nuclear clock. It will complement highly precise optical atomic clocks and, in some areas, might in the long run even replace them. Moreover, beyond its superb timekeeping capabilities, a nuclear clock is a unique type of quantum sensor, allowing fundamental-physics tests ranging from the variation of fundamental constants to searches for dark matter.

The post From atomic to nuclear clocks appeared first on CERN Courier.

Tracing molecules at the vacuum frontier https://cerncourier.com/a/tracing-molecules-at-the-vacuum-frontier/ Mon, 05 Sep 2022 09:11:50 +0000 https://preview-courier.web.cern.ch/?p=105801 The CERN-developed simulator “Molflow” has become the de-facto industry standard for ultra-high-vacuum simulations, with applications ranging from chip manufacturing to the exploration of the Martian surface.

Thermal-radiation calculation

In particle accelerators, large vacuum systems guarantee that the beams travel as freely as possible. Even at one 25-trillionth the density of Earth’s atmosphere, however, a tiny concentration of gas molecules remains. These pose a problem: their collisions with accelerated particles reduce the beam lifetime and induce instabilities. It is therefore vital, from the early design stage, to plan efficient vacuum systems and predict residual pressure profiles.

Surprisingly, it is almost impossible to find commercial software that can carry out the underlying vacuum calculations. Since the background pressure in accelerators (of the order of 10–9 to 10–12 mbar) is so low, molecules rarely collide with one another, and thus the results of codes based on computational fluid dynamics are not valid. Although workarounds exist (solving vacuum equations analytically, modelling a vacuum system as an electrical circuit, or taking advantage of similarities between ultra-high vacuum and thermal radiation), a CERN-developed simulator, “Molflow”, for molecular flow, has become the de-facto industry standard for ultra-high-vacuum simulations.
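
The electrical-circuit analogy is simple enough to sketch: in molecular flow, the throughput Q = C(P1 – P2) through a component of conductance C plays the role of Ohm’s law, so conductances combine exactly like electrical ones. The snippet below is a minimal illustration in Python, not anything taken from Molflow; it uses the textbook long-tube conductance formula for air at room temperature, and the pump speed and dimensions are invented for the example.

```python
# Electrical-circuit analogy for molecular flow: throughput Q = C * (P1 - P2)
# plays the role of Ohm's law, so conductances combine like electrical ones.

def series(*conductances):
    """Total conductance of components in series: 1/C = sum of 1/Ci."""
    return 1.0 / sum(1.0 / c for c in conductances)

def tube_conductance(diameter_cm, length_cm):
    """Textbook long-tube formula for air at room temperature, in l/s."""
    return 12.1 * diameter_cm**3 / length_cm

# A pump of speed S behind a tube of conductance C acts like S and C in series.
C = tube_conductance(10.0, 200.0)   # 10 cm bore, 2 m long: ~60 l/s
S_eff = series(1000.0, C)           # 1000 l/s pump -> ~57 l/s effective
print(f"C = {C:.0f} l/s, effective pumping speed = {S_eff:.0f} l/s")
```

The example also shows why the analogy, though useful, pushes designers towards simulation: once the tube dominates, the nominal pump speed tells you almost nothing about the pressure profile along the chamber.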

Instead of trying to solve analytically the surprisingly difficult gas behaviour over a large system in one step, Molflow is based on the so-called test-particle Monte Carlo method. In a nutshell: if the geometry is known, a single test particle is created at a gas source and “bounced” through the system until it reaches a pump. Repeating this millions of times, with each bounce happening in a random direction, just like in the real world, the program can calculate the hit density anywhere, from which the pressure is obtained.
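
To see the test-particle method in action, the following is a deliberately stripped-down sketch, not Molflow’s actual implementation: it traces molecules through a cylindrical tube with diffuse (cosine-law) wall bounces and estimates the probability of reaching the far end, a quantity with tabulated reference values (the Clausing factor, roughly 0.67 for a tube whose length equals its radius).

```python
import math
import random

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def cosine_direction(normal):
    """Unit vector drawn from Lambert's cosine law about `normal` --
    the standard model for diffuse re-emission from a chamber wall."""
    u, phi = random.random(), 2.0 * math.pi * random.random()
    ct, st = math.sqrt(u), math.sqrt(1.0 - u)
    helper = (1.0, 0.0, 0.0) if abs(normal[0]) < 0.9 else (0.0, 1.0, 0.0)
    t1 = cross(normal, helper)
    n1 = math.sqrt(sum(x * x for x in t1))
    t1 = tuple(x / n1 for x in t1)
    t2 = cross(normal, t1)
    return tuple(st * math.cos(phi) * t1[i] + st * math.sin(phi) * t2[i]
                 + ct * normal[i] for i in range(3))

def transmission_probability(length, radius, n=100_000):
    """Fraction of molecules entering a cylindrical tube that leave through
    the far end (the Clausing factor), by test-particle Monte Carlo."""
    transmitted = 0
    for _ in range(n):
        r, a = radius * math.sqrt(random.random()), 2.0 * math.pi * random.random()
        x, y, z = r * math.cos(a), r * math.sin(a), 0.0   # uniform on entrance disc
        dx, dy, dz = cosine_direction((0.0, 0.0, 1.0))
        while True:
            # flight distance to the cylindrical wall: |(x,y) + t*(dx,dy)| = radius
            A, B = dx * dx + dy * dy, x * dx + y * dy
            C = x * x + y * y - radius * radius
            t_wall = (-B + math.sqrt(max(0.0, B * B - A * C))) / A if A > 1e-12 else math.inf
            # flight distance to whichever end plane the molecule is heading for
            t_end = ((length - z) / dz if dz > 0 else -z / dz) if dz != 0 else math.inf
            if t_end < t_wall:                    # escapes through an end plane
                transmitted += dz > 0
                break
            x, y, z = x + t_wall * dx, y + t_wall * dy, z + t_wall * dz
            dx, dy, dz = cosine_direction((-x / radius, -y / radius, 0.0))  # diffuse bounce
    return transmitted / n

# Clausing's tabulated value for a tube with length = radius is ~0.672
print(transmission_probability(length=1.0, radius=1.0))
```

A real code like Molflow does essentially this over arbitrary triangulated geometries, recording hit densities on every facet rather than a single transmission number.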

The idea for Molflow emerged in 1988 when the author (RK) visited CERN to discuss the design of the Elettra light source with CERN vacuum experts (see “From CERN to Elettra, ESRF, ITER and back” panel). Back then, few people could have foreseen the numerous applications outside particle physics that it would have. Today, Molflow is used in applications ranging from chip manufacturing to the exploration of the Martian surface, with more than 1000 users worldwide and many more downloads from the dedicated website.

Molflow in space 

While at CERN we naturally associate ultra-high vacuum with particle accelerators, there is another domain where operating pressures are extremely low: space. In 2017, after first meeting at a conference, a group from German satellite manufacturer OHB visited the CERN vacuum group, interested to see our chemistry lab and the cleaning process applied to vacuum components. We also demoed Molflow for vacuum simulations. It turned out that they were actively looking for a modelling tool that could simulate specific molecular-contamination transport phenomena for their satellites, since the industrial code they were using had very limited capabilities and was not open-source. 

A high-quality, clean mirror for a space telescope, for example, must spend up to two weeks encapsulated in the closed fairing from launch until it is deployed in orbit. During this time, without careful prediction and mitigation, certain volatile compounds (such as adhesive used on heating elements) present within the spacecraft can find their way to and become deposited on optical elements, reducing their reflectivity and performance. It is therefore necessary to calculate the probability that molecules migrate from a certain location, through several bounces, and end up on optical components. Whereas this is straightforward when all simulation parameters are static, adding chemical processes and molecule accumulation on surfaces required custom development. Even though Molflow could not handle these processes “out of the box”, the OHB team was able to use it as a basis that could be built on, saving the effort of creating the graphical user interface and the ray-tracing parts from scratch. With the help of CERN’s knowledge-transfer team, a collaboration was established with the Technical University of Munich: a “fork” in the code was created; new physical processes specific to their application were added; and the code was also adapted to run on computer clusters. The work was made publicly available in 2018, when Molflow became open source.

From CERN to Elettra, ESRF, ITER and back

Molflow simulation

Molflow emerged in 1988 during a visit to CERN from its original author (RK), who was working at the Elettra light source in Trieste at the time. CERN vacuum expert Alberto Pace showed him a computer code written in Fortran that enabled the trajectories of particles to be calculated, via a technique called ray tracing. On returning to Trieste, and realising that the CERN code couldn’t be run there due to hardware and software incompatibilities, RK decided to rewrite it from scratch. Three years later the code was formally released. Once more, credit must be given to CERN for having been the birthplace of new ideas for other laboratories to develop their own applications.

Molflow was originally written in Turbo Pascal, had (black and white) graphics, and visualised geometries in 3D – even allowing basic geometry editing and pressure plots. While today such features are found in every simulator, at the time the code stood out and was used in the design of several accelerator facilities, including the Diamond Light Source, Spallation Neutron Source, Elettra, Alba and others – as well as for the analysis of a gas-jet experiment for the PANDA experiment at GSI Darmstadt. That said, the early code had its limitations. For example, the upper limit of user memory (640 kB for MS-DOS) significantly limited the number of polygons used to describe the geometry, and it was single-processor. 

In 2007 the original code was given a new lease of life at the European Synchrotron Radiation Facility in Grenoble, where RK had moved as head of the vacuum group. The code was ported to C++ and gained multi-processor capability, which is particularly suitable for Monte Carlo calculations: if you have eight CPU cores, for example, you can trace eight molecules at the same time. OpenGL (Open Graphics Library) acceleration made the visualisation very fast even for large structures, allowing the usual camera controls of CAD editors to be added. Between 2009 and 2011 Molflow was used at ITER, again following its original author, for the design and analysis of vacuum components for the international tokamak project.

In 2012 the project was resumed at CERN, where RK had arrived the previous year. From here, the focus was on expanding the physics and applications: ray-tracing terms like “hit density” and “capture probability” were replaced with real-world quantities such as pressure and pumping speed. To publish the code within the group, a website was created with downloads, tutorial videos and a user forum. Later that year, a sister code “Synrad” for synchrotron-radiation calculations, also written in Trieste in the 1990s, was ported to the modern environment. The two codes could, for the first time, be used as a package: first, a synchrotron-radiation simulation could determine where light hits a vacuum chamber, then the results could be imported to a subsequent vacuum simulation to trace the gas desorbed from the chamber walls. This is the so-called photon-stimulated desorption effect, which is a major hindrance to many accelerators, including the LHC.

Molflow and Synrad have been downloaded more than 1000 times in the past year alone, and anonymous user metrics hint at around 500 users who launch it at least once per month. The code is used by far the most in China, followed by the US, Germany and Japan. Switzerland, including users at CERN, places only fifth. Since 2018, the roughly 35,000-line code has been available open-source and, although originally written for Windows, it is now available for other operating systems, including the new ARM-based Macs and several versions of Linux.

One year later, the Contamination Control Engineering (CCE) team from NASA’s Jet Propulsion Laboratory (JPL) in California reached out to CERN in the context of its three-stage Mars 2020 mission. The Mars 2020 Perseverance Rover, built to search for signs of ancient microbial life, successfully landed on the Martian surface in February 2021 and has collected and cached samples in sealed tubes. A second mission plans to retrieve the cache canister and launch it into Mars orbit, while a third would locate and capture the orbital sample and return it to Earth. Each spacecraft experiences and contributes to its own contamination environment through thruster operations, material outgassing and other processes. JPL’s CCE team performs the identification, quantification and mitigation of such contaminants, from the concept-generation to the end-of-mission phase. Key to this effort is the computational physics modelling of contaminant transport from materials outgassing, venting, leakage and thruster plume effects.

Contamination consists of two types: molecular (thin-film deposition effects) and particulate (producing obscuration, optical scatter, erosion or mechanical damage). Both can lead to degradation of optical properties and spurious chemical composition measurements. As more sensitive space missions are proposed and built – particularly those that aim to detect life – understanding and controlling outgassing properties requires novel approaches to operating thermal vacuum chambers. 

Just like accelerator components, most spacecraft hardware undergoes long-duration vacuum baking at relatively high temperatures to reduce outgassing. Outgassing rates are verified with quartz crystal microbalances (QCMs), rather than vacuum gauges as used at CERN. These probes measure the resonance frequency of oscillation, which is affected by the accumulation of adsorbed molecules, and are very sensitive: a 1 ng deposition on 1 cm2 of surface de-tunes the resonance frequency by 2 Hz. By performing free-molecular transport simulations in the vacuum-chamber test environment, measurements by the QCMs can be translated to outgassing rates of the sources, which are located some distance from the probes. For these calculations, JPL currently uses both Monte Carlo schemes (via Molflow) and “view factor matrix” calculations (through in-house solvers). During one successful Molflow application (see “Molflow in space” image, top) a vacuum chamber with a heated inner shroud was simulated, and optimisation of the chamber geometry resulted in a factor-40 increase of transmission to the QCMs over the baseline configuration. 
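
Translating a QCM reading into a deposited mass is a one-line calculation given the sensitivity quoted above; the frequency shift and bake duration in this sketch are invented purely for illustration.

```python
# Converting a QCM de-tuning into deposited mass, using the 2 Hz per ng/cm^2
# sensitivity quoted above. The shift and duration are invented for illustration.
SENSITIVITY = 2.0                       # Hz per (ng/cm^2)

def areal_mass(freq_shift_hz):
    """Deposited areal mass in ng/cm^2 inferred from a frequency shift."""
    return freq_shift_hz / SENSITIVITY

shift_hz, hours = 50.0, 24.0            # hypothetical measurement
mass = areal_mass(shift_hz)             # 25 ng/cm^2
print(f"{mass:.0f} ng/cm^2 deposited, {mass / hours:.2f} ng/cm^2 per hour")
```

The harder part, and the reason free-molecular simulation is needed at all, is relating that locally measured deposition back to the outgassing rate of a source elsewhere in the chamber.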

From SPHEREx to LISA

Another JPL project involving free molecular-flow simulations is the future near-infrared space observatory SPHEREx (Spectro-Photometer for the History of the Universe and Ices Explorer). This instrument has cryogenically cooled optical surfaces that may condense molecules in vacuum and are thus prone to significant performance degradation from the accumulation of contaminants, including water. Even when taking as much care as possible during the design and preparation of the systems, some species, such as water, cannot be entirely removed from a spacecraft and will desorb from materials persistently. It is therefore vital to know where and how much contamination will accumulate. For SPHEREx, water outgassing, molecular transport and adsorption were modelled using Molflow against internal thermal predictions, enabling a decontamination strategy to keep its optics free from performance-degrading accumulation (see “Molflow in space” image, left). Molflow has also complemented other NASA JPL codes to estimate the return flux (whereby gas particles desorbing from a spacecraft return to it after collisions with a planet’s atmosphere) during a series of planned fly-bys around Jupiter’s moon Europa. For such exospheric sampling missions, it is important to distinguish the actual collected sample from return-flux contaminants that originated from the spacecraft but ended up being collected due to atmospheric rebounds.

Vacuum-chamber simulation for NASA

It is the ability to import large, complex geometries (through a triangulated file format called STL, used in 3D printing and supported by most CAD software) that makes Molflow usable for JPL’s molecular-transport problems. In fact, the JPL team “boosted” our codes with external post-processing: instead of relying on the built-in visualisation, they parsed the output file format to extract pressure data on individual facets (polygons representing a surface cell), and sometimes even changed input parameters programmatically – once again working directly on Molflow’s own file format. They also made a few feature requests, such as adding histograms showing how many times molecules bounce before adsorption, or the total distance or time they travel before being adsorbed on the surfaces. These were straightforward to implement, and because JPL’s scientific interests also matched those of CERN users, such additions are now available to everyone in the public versions of the code. Similar requests have come from experiments employing short-lived radioactive beams, such as those generated at CERN’s ISOLDE beamlines. Last year, against all odds during COVID-related restrictions, the JPL team managed to visit CERN. In between tours of the site and the chemistry laboratory, they gave a seminar for our vacuum group about contamination control at JPL, and we presented the outlook for Molflow developments.
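
The binary STL format itself is simple: an 80-byte header, a 32-bit triangle count, then 50 bytes per triangle (a normal, three vertices and an attribute word). A minimal Python reader, similar in spirit to the kind of external scripting described above though not the JPL team’s actual code, might look as follows; the file name is a placeholder.

```python
# A minimal reader for the standard binary STL format: enough to get
# triangles into a ray tracer or a post-processing script.
import struct

def read_binary_stl(path):
    """Return a list of triangles, each a tuple of three (x, y, z) vertices."""
    triangles = []
    with open(path, "rb") as f:
        f.read(80)                                    # fixed-size header, ignored
        (count,) = struct.unpack("<I", f.read(4))     # number of triangles
        for _ in range(count):
            data = struct.unpack("<12fH", f.read(50))  # normal, 3 vertices, attribute
            triangles.append((data[3:6], data[6:9], data[9:12]))
    return triangles

# triangles = read_binary_stl("chamber.stl")   # hypothetical geometry file
```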

Our latest space-related collaboration, started in 2021, concerns the European Space Agency’s LISA mission, a future gravitational-wave interferometer in space (see CERN Courier September/October 2022 p51). Molflow is being used to analyse data from the recently completed LISA Pathfinder mission, which explored the feasibility of keeping two test masses in gravitational free-fall and using them as inertial sensors by measuring their motion with extreme precision. Because the satellite’s sides have different temperatures, and because the gas sources are asymmetric around the masses, the outgassing differs between the two sides. Moreover, the gas molecules that reach the test mass are slightly faster on one side than the other, resulting in a net force and torque acting on the mass, of the order of femtonewtons. When such precise inertial measurements are required, this phenomenon has to be quantified, along with other microscopic forces, such as Brownian noise resulting from the random bounces of molecules on the test mass. To this end, Molflow is currently being modified to add molecular force calculations for LISA, along with relevant physical quantities such as noise and resulting torque.

Sky’s the limit 

High-energy applications

Molflow has proven to be a versatile and effective computational physics tool for the characterisation of free-molecular flow, having been adopted for use in space exploration and the aerospace sector. It promises to continue to intertwine different fields of science in unexpected ways. Thanks to the ever-growing gaming industry, which uses ray tracing to render photorealistic scenes with multiple light sources, consumer-grade graphics cards started supporting ray tracing in 2019. Although intended for gaming, they are programmable for generic purposes, including science applications. Simulating on graphics-processing units is much faster than on traditional CPUs, but it is also less precise: in the vacuum world, tiny imprecisions in the geometry can result in “leaks”, with some simulated particles crossing internal walls. If this issue can be overcome, the speed-up potential is huge: in-house testing carried out recently at CERN by PhD candidate Pascal Bahr demonstrated a speed-up factor of up to 300 on entry-level Nvidia graphics cards, for example.

Another planned Molflow feature is to include surface processes that change the simulation parameters dynamically. For example, some getter films gradually lose their pumping ability as they saturate with gas molecules. This saturation depends on the pumping speed itself, resulting in two parameters (pumping speed and molecular surface saturation) that depend on each other. The way around this is to perform the simulation in iterative time steps, which is straightforward to add but raises many numerical problems.
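
The flavour of such an iterative scheme can be captured in a few lines: freeze the pumping speed during a time step, solve the quasi-static pressure balance, then update the surface coverage and repeat. The toy model below is far simpler than what a real implementation would need, and every number in it is invented for illustration.

```python
# Iterative time stepping for a saturating getter: pumping speed and surface
# coverage depend on each other, so each step freezes one while updating the
# other. A toy 0-D model; all numbers are invented for illustration.
GAS_LOAD = 1e-6      # constant gas load into the system (mbar l/s)
S0 = 500.0           # pumping speed of the fresh getter (l/s)
CAPACITY = 1.0       # coverage (arbitrary units) at full saturation
UPTAKE = 10.0        # coverage gained per mbar l of pumped gas (arbitrary)
DT = 3600.0          # time step (s)

coverage = 0.0
for step in range(100):
    speed = S0 * max(0.0, 1.0 - coverage / CAPACITY)   # speed falls with coverage
    if speed <= 0.0:
        print(f"getter saturated after {step} h")
        break
    pressure = GAS_LOAD / speed                        # quasi-static balance Q = S*p
    coverage += UPTAKE * GAS_LOAD * DT                 # then update the coverage
    if step % 10 == 0:
        print(f"t = {step:3d} h: S = {speed:5.0f} l/s, p = {pressure:.2e} mbar")
```

In a full Monte Carlo setting the same loop structure applies, except that each time step re-runs the ray tracing with the updated sticking coefficients, which is where the numerical difficulties mentioned above arise.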

Finally, a much-requested feature is automation. The most recent versions of the code already allow scripting, that is, running batch jobs with physics parameters changed step-by-step between each execution. Extending these automation capabilities, and adding export formats that allow easier post-processing with common tools (Matlab, Excel and common Python libraries) would significantly increase usability. If adding GPU ray tracing and iterative simulations are successful, the resulting – much faster and more versatile – Molflow code will remain an important tool to predict and optimise the complex vacuum systems of future colliders.

The post Tracing molecules at the vacuum frontier appeared first on CERN Courier.

Exploring a laser-hybrid accelerator for radiotherapy https://cerncourier.com/a/exploring-a-laser-hybrid-accelerator-for-radiotherapy/ Mon, 05 Sep 2022 08:46:46 +0000 https://preview-courier.web.cern.ch/?p=105924 A multidisciplinary team in the UK has received seed funding to develop a conceptual design report for an advanced ion-therapy research facility.

LhARA

A multidisciplinary team in the UK has received seed funding to investigate the feasibility of a new facility for ion-therapy research based on novel accelerator, instrumentation and computing technologies. At the core of the facility would be a laser-hybrid accelerator dubbed LhARA: a high-power pulsed laser striking a thin foil target would create a large flux of protons or ions, which are captured using strong-focusing electron–plasma lenses and then accelerated rapidly in a fixed-field alternating-gradient accelerator. Such a device, says the team, offers enormous clinical potential by providing more flexible, compact and cost-effective multi-ion sources.

High-energy X-rays are by far the most common radiotherapy tool, but recent decades have seen a growth in particle-beam radiotherapy. In contrast to X-rays, proton and ion beams can be manipulated to deliver radiation doses more precisely, sparing surrounding healthy tissue. Unfortunately, the number of ion-treatment facilities is small because they require large synchrotrons to accelerate the ions. The Proton-Ion Medical Machine Study undertaken at CERN during the late 1990s, for example, underpinned the CNAO (Italy) and MedAustron (Austria) treatment centres that helped propel Europe to the forefront of the field – work that is now being continued by CERN’s Next Ion Medical Machine Study (CERN Courier July/August 2021 p23).

“LhARA will greatly accelerate our understanding of how protons and ions interact and are effective in killing cancer cells, while simultaneously giving us experience in running a novel beam,” says LhARA biological science programme manager Jason Parsons of the University of Liverpool. “Together, the technology and the science will help us make a big step forward in optimising radiotherapy treatments for cancer patients.” 

A small number of laboratories in Europe already work on laser-driven sources for biomedical applications. The LhARA collaboration, which comprises physicists, biologists, clinicians and engineers, aims to build on this work to demonstrate the feasibility of capturing and manipulating the flux created in the laser-target interaction to provide a beam that can be accelerated rapidly to the desired energy. The laser-driven source offers the opportunity to capture intense, nanosecond-long pulses of protons and ions at an energy of 15 MeV, says the team. This is two orders of magnitude greater than in conventional sources, allowing the space-charge limit on the instantaneous dose to be evaded. 

In July, UK Research and Innovation granted £2 million over the next two years to deliver a conceptual design report for an Ion Therapy Research Facility (ITRF) centred around LhARA. The first goal is to demonstrate the feasibility of the laser-hybrid approach in a facility dedicated to biological research, after which the team will work with national and international partnerships to develop the clinical technique. While the programme carries significant technical risk, says LhARA co-spokesperson Kenneth Long from Imperial College London/STFC, it is justified by the high level of potential reward: “The multi­disciplinary approach of the LhARA collaboration will place the ITRF at the forefront of the field, partnering with industry to pave the way for significantly enhanced access to state-of-the-art particle-beam therapy.” 

The post Exploring a laser-hybrid accelerator for radiotherapy appeared first on CERN Courier.

First light beckons at SLAC’s LCLS-II https://cerncourier.com/a/first-light-beckons-at-slacs-lcls-ii/ Mon, 05 Sep 2022 08:40:59 +0000 https://preview-courier.web.cern.ch/?p=105785 An ambitious upgrade of the US flagship X-ray free-electron laser rests on sustained cooperation with high-energy physics labs in the US, Europe and Japan.

The LCLS undulator hall

An ambitious upgrade of the US’s flagship X-ray free-electron-laser facility – the Linac Coherent Light Source (LCLS) at SLAC in California – is nearing completion. Set for “first light” early next year, LCLS-II will deliver X-ray laser beams that are 10,000 times brighter than LCLS at repetition rates of up to a million pulses per second – generating more X-ray pulses in just a few hours than the current laser has delivered through the course of its 12-year operational lifetime. The cutting-edge physics of the new facility – underpinned by a cryogenically cooled superconducting radio-frequency (SRF) linac – will enable the two beams from LCLS and LCLS-II to work in tandem. This, in turn, will help researchers observe rare events that happen during chemical reactions and study delicate biological molecules at the atomic scale in their natural environments, as well as potentially shed light on exotic quantum phenomena with applications in next-generation quantum computing and communications systems. 

Successful delivery of the LCLS-II linac was possible thanks to a multi-centre collaborative effort involving US national and university laboratories, following the decision to pursue an SRF-based machine in 2014 through the design, assembly, test, transportation and installation of a string of 37 SRF cryomodules (most of them more than 12 m long) into the SLAC tunnel. All told, this major undertaking necessitated the construction of forty 1.3 GHz SRF cryomodules (five of them spares) and three 3.9 GHz cryomodules (one spare) – with delivery of approximately one cryomodule per month from February 2019 until December 2020 to allow completion of the LCLS-II linac installation on schedule by November 2021. 

This industrial-scale programme of works was shaped by a strategic commitment, early on in the LCLS-II design phase, to transfer, and ultimately iterate, the established SRF capabilities of the European XFEL in Hamburg into the core technology platform used for the LCLS-II SRF cryomodules. Put simply: it would not have been possible to complete the LCLS-II project, within cost and on schedule, without the sustained cooperation of the European XFEL consortium – in particular, colleagues at DESY, CEA Saclay and several other European laboratories as well as KEK – that generously shared their experiences and know-how. 

Better together 

These days, large-scale accelerator or detector projects are very much a collective endeavour. Not only is the sprawling scope of such projects beyond a single organisation, but the risks of overspend and slippage can greatly increase with a “do-it-on-your-own” strategy. When the LCLS-II project opted for an SRF technology pathway in 2014 to maximise laser performance, the logical next step was to build a broad-based coalition with other US Department of Energy (DOE) national laboratories and universities. In this case, SLAC, Fermilab, Jefferson Lab (JLab) and Cornell University contributed expertise for cryomodule production, while Argonne National Laboratory and Lawrence Berkeley National Laboratory managed delivery of the undulators and photoinjector for the project. For sure, the start-up time for LCLS-II would have increased significantly without this joint effort, extending the overall project by several years.

Superconducting accelerator

Each partner brought something unique to the LCLS-II collaboration. While SLAC was still a relative newcomer to SRF technologies, the lab had a management team that was familiar with building large-scale accelerators (following successful delivery of the LCLS). The priority for SLAC was therefore to scale up its small nucleus of SRF experts by recruiting experienced SRF technologists and engineers to the staff team. In contrast, the JLab team brought an established track record in the production of SRF cryomodules, having built its own machine, the Continuous Electron Beam Accelerator Facility (CEBAF), as well as cryomodules for the Spallation Neutron Source (SNS) linac at Oak Ridge National Laboratory in Tennessee. Cornell, too, came with a rich history in SRF R&D – capabilities that, in turn, helped to solidify the SRF cavity-preparation process for LCLS-II. 

Finally, Fermilab had, at the time, recently built two cutting-edge cryomodules of the same style as that chosen for LCLS-II. To fabricate these modules, Fermilab worked closely with the team at DESY to set up the same type of production infrastructure used on the European XFEL. From that perspective, the required tooling and fixtures were all ready to go for the LCLS-II project. While Fermilab was the “designer of record” for the SRF cryomodule, with primary responsibility for delivering a working design to meet LCLS-II requirements, the realisation of an optimised technology platform was a team effort involving SRF experts from across the collaboration.

Collective problems, collective solutions 

While the European XFEL provided the template for the LCLS-II SRF cryomodule design, several key elements of the LCLS-II approach subsequently evolved to align with the continuous-wavelength (CW) operation requirements and the specifics of the SLAC tunnel. Success in tackling these technical challenges – across design, assembly, testing and transportation of the cryomodules – is testament to the strength of the LCLS-II collaboration and the collective efforts of the participating teams in the US and Europe.

For one, the thermal performance specification of the SRF cavities exceeded the state-of-the-art and required development and industrialisation of the concept of nitrogen doping (a process in which SRF cavities are heat-treated in a nitrogen atmosphere to increase their cryogenic efficiency and, in turn, lower the overall operating costs of the linac). The nitrogen-doping technique was invented at Fermilab in 2012 but, prior to LCLS-II construction, had been used only in an R&D setting.

The priority was clear: to transfer the nitrogen-doping capability to LCLS-II’s industry partners, so that the cavity manufacturers could perform the necessary materials processing before final helium-vessel jacketing. During this knowledge transfer, it was found that nitrogen-doped cavities are particularly sensitive to the base niobium sheet material – something the collaboration only realised once the cavity vendors were into full production. This resulted in a number of changes to the heat-treatment temperature, depending on which material supplier was used and the specific properties of the niobium sheet deployed in different production runs. JLab, for its part, held the contract for the cavities and pulled out all the stops to ensure success.

SRF cryomodules

At the same time, the conversion from pulsed to CW operation necessitated a faster cooldown cycle for the SRF cavities, requiring several changes to the internal piping, a larger exhaust chimney on the helium vessel, as well as the addition of two new cryogenic valves per cryomodule. Also significant is the 0.5% slope in the longitudinal floor of the existing SLAC tunnel, which dictated careful attention to liquid-helium management in the cryomodules (with a separate two-phase line and liquid-level probes at both ends of every module). 

However, the biggest setback during LCLS-II construction involved the loss of beamline vacuum during cryomodule transport. Specifically, two cryomodules had their beamlines vented and required complete disassembly and rebuilding, resulting in a five-month moratorium on shipping of completed cryomodules in the second half of 2019. It turned out that a small, and seemingly inconsequential, change in a coupler flange made the cold coupler assembly susceptible to resonances excited by transport. The result was a bellows tear that vented the beamline. Unfortunately, initial “road tests” with a similar, though not exactly identical, prototype cryomodule had not revealed this behaviour. 

Such challenges are inevitable when developing new facilities at the limits of known technology. In the end, the problem was successfully addressed using the diverse talents of the collaboration to brainstorm solutions, with the available access ports allowing an elastomer wedge to be inserted to secure the vulnerable section. A key take-away here is the need for future projects to perform thorough transport analysis, verify the transport loads using mock-ups or dummy devices, and install adequate instrumentation to ensure granular data analysis before long-distance transport of mission-critical components. 

The last cryomodule from Fermilab

Upon completion of the assembly phase, all LCLS-II cryo­modules were subsequently tested at either Fermilab or JLab, with one module tested at both locations to ensure reproducibility and consistency of results. For high Q0 performance in nitrogen-doped cavities, cooldown flow rates of at least 30 g/s of liquid helium were found to give the best results, helping to expel magnetic flux that could otherwise be trapped in the cavity. Overall, cryomodule performance on the test stands exceeded specifications, with a total accelerating voltage per cryomodule of 158 MV (versus specification of 128 MV) and average Q0 of 3 × 1010 (versus specification of 2.7 × 1010). Looking ahead, attention is already shifting to the real-world cryomodule performance in the SLAC tunnel – something that was measured for the first time in 2022.
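
To get a feeling for what these test-stand numbers imply, the power dissipated into the 2 K helium bath can be estimated from the standard relation P = V²/((R/Q)·Q0). The sketch below divides the cryomodule voltage equally over its eight nine-cell cavities and uses the textbook R/Q of the TESLA cavity shape; it is a back-of-envelope estimate at maximum voltage, not an official project figure.

```python
# Back-of-envelope 2 K dissipation implied by the test-stand results above:
# P = V^2 / ((R/Q) * Q0) per cavity. R/Q is the textbook TESLA-shape value;
# the per-cavity voltage assumes the module total split over eight cavities.
R_OVER_Q = 1036.0        # ohms, 9-cell TESLA cavity shape
Q0 = 3e10                # average quality factor measured on the test stands
V_MODULE = 158e6         # V, maximum total accelerating voltage per cryomodule
N_CAV = 8

v = V_MODULE / N_CAV                     # ~20 MV per cavity
p = v**2 / (R_OVER_Q * Q0)               # ~13 W per cavity
print(f"{p:.1f} W per cavity, ~{N_CAV * p:.0f} W per cryomodule at 2 K")
```

The strong inverse dependence on Q0 is exactly why nitrogen doping matters: at these gradients, even a modest drop in quality factor translates directly into tens of extra watts that the 2 K cryoplant must absorb.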

Transferable lessons

For all members of the collaboration working on the LCLS-II cryomodules, this challenging project holds many lessons. Most important is to build a strong team and use that strength to address problems in real time as they arise. The mantra “we are all in this together” should be front and centre for any multi-institutional scientific endeavour – as it was in this case. Solutions need to be thought of in a more global sense, as the best answer might mean another collaborator taking more onto their plate. Collaboration implies true partnership and a working model very different from a transactional customer–vendor relationship.

From a planning perspective, it’s vital to ensure that the initial project cost and schedule are consistent with the technical challenges and preparedness of the infrastructure. Prototypes and pre-series production runs reduce risk and cost in the long term and should be part of the plan, but there must be sufficient time for data analysis and changes to be made after a prototype run in order for it to be useful. Time spent on detailed technical reviews is also time well spent. New designs of complex components need comprehensive oversight and review, and should be controlled by a team, rather than a single individual, so that sign-off on any detailed design change is made by an informed collective. 

LCLS-II science: capturing atoms and molecules in motion like never before

LCLS-II science

The strobe-like pulses of the LCLS, which produced its first light in April 2009, are just a few millionths of a billionth of a second long, and a billion times brighter than previous X-ray sources. This enables users from a wide range of fields to take crisp pictures of atomic motions, watch chemical reactions unfold, probe the properties of materials and explore fundamental processes in living things. LCLS-II will provide a major jump in capability – moving from 120 pulses per second to 1 million, enabling experiments that were previously impossible. The scientific community has identified six areas where the unique capabilities of LCLS-II will be essential for further scientific progress:

Nanoscale materials dynamics, heterogeneity and fluctuations 

Programmable trains of soft X-ray pulses at high rep rate will characterise spontaneous fluctuations and heterogeneities at the nanoscale across many decades, while coherent hard X-ray scattering will provide unprecedented spatial resolution of material structure, its evolution and relationship to functionality under operating conditions.

Fundamental energy and charge dynamics

High-repetition-rate soft X-rays will enable new techniques that will directly map charge distributions and reaction dynamics at the scale of molecules, while new nonlinear X-ray spectroscopies offer the potential to map quantum coherences in an element-specific way for the first time.

Catalysis and photocatalysis

Time-resolved, high-sensitivity, element- specific spectroscopy will provide the first direct view of charge dynamics and chemical processes at interfaces, characterise subtle conformational changes associated with charge accumulation, and capture rare chemical events in operating catalytic systems across multiple time and length scales – all of which are essential for designing new, more efficient systems for chemical transformation and solar-energy conversion.

Emergent phenomena in quantum materials

Fully coherent X-rays will enable new high-resolution spectroscopy techniques to map the collective excitations that define these new materials in unprecedented detail. Ultrashort X-ray pulses and optical fields will facilitate new methods for manipulating charge, spin and phonon modes to both advance fundamental understanding and point the way to new approaches for materials control.

Revealing biological function in real time

The high repetition rate of LCLS-II will provide a unique capability to follow the dynamics of macromolecules and interacting complexes in real time and in native environments. Advanced solution-scattering and coherent imaging techniques will characterise the conformational dynamics of heterogeneous ensembles of macromolecules, while the ability to generate “two-colour” hard X-ray pulses will resolve atomic-scale structural dynamics of biochemical processes that are often the first step leading to larger-scale protein motions.

Matter in extreme environments

The capability of LCLS-II to generate soft and hard X-ray pulses simultaneously will enable the creation and observation of extreme conditions that are far beyond our present reach, with the latter allowing the characterisation of unknown structural phases. Unprecedented spatial and temporal resolution will enable direct comparison with theoretical models relevant for inertial-confinement fusion and planetary science.

Work planning and control is another essential element for success and safety. This idea needs to be built into the “manufacturing system”, including into the cost and schedule, and to be part of each individual’s daily checklist. No one disagrees with this concept, but good intentions on their own will not suffice. As such, required safety documentation should be clear and unambiguous, and be reviewed by people with relevant expertise. Production data and documentation need to be collected, made easily available to the entire project team, and analysed regularly for trends, both positive and negative. 

Supply chain, of course, is critical in any production environment – and LCLS-II is no exception. When possible, it is best to have parts procured, inspected, accepted and on-the-shelf before production begins, thereby eliminating possible workflow delays. Pre-stocking also allows adequate time to recycle and replace parts that do not meet project specifications. Also worth noting is that it’s often the smaller components – such as bellows, feedthroughs and copper-plated elements – that drive workflow slowdowns. A key insight from LCLS-II is to place purchase orders early, stay on top of vendor deliveries, and perform parts inspections as soon as possible post-delivery. Projects also benefit from having clearly articulated pass/fail criteria and established procedures for handling non-conformance – all of which alleviates the need to make critical go/no-go acceptance decisions in the face of schedule pressures.

Finally, it’s worth highlighting the broader impact – both personal and professional – on individual team members participating in a big-science collaboration like LCLS-II. At the end of the build, what remained after designs were completed, problems solved, production rates met, and cryomodules delivered and installed, were the friendships that had been nurtured over several years. The collaboration amongst partners, formal and informal, who truly cared about the project’s success and had each other’s backs when issues arose: these are the things that solidified the mutual respect and camaraderie and, in the end, made LCLS-II such a rewarding project.

First light

In April 2022 the new LCLS-II linac was successfully cooled to its 2 K operating temperature. The next step was to pump the SRF cavities with more than a megawatt of microwave power to accelerate the electron beam from the new source. Following further commissioning of the machine, first X-rays are expected to be produced in early 2023. 

As with many accelerator projects, LCLS-II is not an end-point in itself, more an evolutionary transition within a longer term roadmap. In fact, work is already under way on LCLS-II HE – a project that will increase the energy of the CW SRF linac from 4 to 8 GeV, enabling the photon energy range to be extended to at least 13 keV, and potentially up to 20 keV at 1 MHz repetition rates. To ensure continuity of production for LCLS-II HE, 25 next-generation cryomodules are in the works, with even higher performance specifications versus their LCLS-II counterparts, while upgrades to the source and beam transport are also being finalised. 

While the fascinating science opportunities for LCLS-II-HE continue to be refined and expanded, of one thing we can be certain: strong collaboration and the collective efforts of the participating teams are crucial. 

The post First light beckons at SLAC’s LCLS-II appeared first on CERN Courier.

Fostering cross-disciplinarity https://cerncourier.com/a/fostering-cross-disciplinarity/ Thu, 14 Jul 2022 16:38:59 +0000 https://preview-courier.web.cern.ch/?p=102037 The 6th edition of the International Summer School on Intelligent Signal Processing for Frontier Research and Industry covered a diverse range of applications.

Irradiating biomaterial

Despite several COVID waves, the organisers of the 6th edition of the International Summer School on Intelligent Signal Processing for Frontier Research and Industry (INFIERI) kept the school an in-person event. It was successfully held at the Autonomous University of Madrid (UAM) from 23 August to 4 September, thanks to the unprecedented speed of the vaccine roll-out, the responsible behaviour of the participants and appropriate logistics.

Against a backdrop of topics ranging from cosmology to the human body and particle physics, the programme covered advanced technologies such as semiconductors, deep sub-micron 3D technologies, data transmission, artificial intelligence and quantum computing.

Topics were presented in lectures and keynote speeches, and the teaching was reinforced via hands-on laboratory sessions, allowing students to practise applications in realistic conditions across a range of areas such as theoretical physics, accelerators, quantum communication, silicon photonics and nanotechnology. The latter included medical applications of new mRNA vaccines, which have long been under investigation for cancer treatment, besides their use against COVID-19. Students could, for instance, analyse real combined PET/MRI images using machine-learning techniques to find biomarkers of illness in a hospital setting, or study the irradiation of a biomaterial using a proton beam. Worldwide experts from academia, industry and laboratories such as CERN either gave lectures or ran lab sessions, most of them attending in person, often for the entire duration of the school.

During the last day, the students presented posters on their own research projects, the high number and quality of which reflected the cross-disciplinary character of the school and the excellence of the participants. Many were selected for the proceedings, in preparation with the Journal of Instrumentation.

The next INFIERI school will only offer in-person attendance, which is considered essential to the series, but if the pandemic continues it will exploit some of the learning gained from the 6th edition.

The post Fostering cross-disciplinarity appeared first on CERN Courier.

Accelerating aerosol production https://cerncourier.com/a/accelerating-aerosol-production/ Fri, 01 Jul 2022 12:27:19 +0000 https://preview-courier.web.cern.ch/?p=101981 CLOUD's discovery of a new mechanism accelerating the formation of aerosol particles in the upper troposphere has potential implications for air-pollution regulations.

A simulation of aerosol-particle formation

The CLOUD collaboration at CERN has uncovered a new mechanism accelerating the formation of aerosol particles in the upper troposphere, with potential implications for air-pollution regulations. The results, published in Nature on 18 May, show that an unexpected synergy between nitric acid, sulphuric acid and ammonia leads to the formation of aerosols at significantly faster rates than those from any two of the three components. The mechanism may represent a major source of cloud and ice seed particles in certain regions of the globe, says the team.

Aerosol particles are known to generally cool the climate by reflecting sunlight back into space and by seeding cloud droplets. But the vapours driving their formation are not well understood. The CLOUD (Cosmics Leaving Outdoor Droplets) facility in CERN’s East Area replicates the atmosphere in an ultraclean chamber to study, under precisely controlled atmospheric conditions, the formation of aerosol particles from trace vapours and how they grow to become the seeds for clouds.

Three is key

Building on earlier findings that ammonia and nitric acid can accelerate the growth rates of newly formed particles, the CLOUD team introduced mixtures of sulphuric acid, nitric acid and ammonia vapours to the chamber and observed the rates at which particles formed. They found that the three vapours together form new particles 10–1000 times faster than a sulphuric acid–ammonia mixture, which previous CLOUD measurements suggested was the dominant source of upper tropospheric particles. Once the three-component particles form, they grow rapidly from the condensation of nitric acid and ammonia alone to sizes where they seed clouds. 

Moreover, the team found these particles to be highly efficient at seeding ice crystals, comparable to desert dust particles, which are thought to be the most widespread and effective ice seeds in the atmosphere. When a supercooled cloud droplet freezes, the resulting ice particle will grow at the expense of any unfrozen droplets nearby, making ice a major factor in the microphysical properties of clouds and precipitation. Around three-quarters of global precipitation is estimated to originate from ice particles.

Feeding their measurements into global aerosol models that include vertical transport of ammonia by deep convective clouds, the CLOUD researchers found that although the particles form locally in ammonia-rich regions of the upper troposphere, such as over the Asian monsoon regions, they travel from Asia to North America in just three days via the subtropical jet stream, potentially influencing Earth’s climate on an intercontinental scale (see “Enhancement” figure). The importance of the new synergistic mechanism depends on the availability of ammonia in the upper troposphere, which originates mainly from livestock and fertiliser emissions. Atmospheric concentrations of all three compounds are much higher today than in the pre-industrial era.

“Our results will improve the reliability of global climate models in accounting for aerosol formation in the upper troposphere and in predicting how the climate will change in the future,” says CLOUD spokesperson Jasper Kirkby. “Once again, CLOUD is finding that anthropogenic ammonia has a major influence on atmospheric aerosol particles, and our studies are informing policies for future air-pollution regulations.”

Working at the intersection between atmospheric science and particle physics, CLOUD has published several important results since it started operations in 2009. These include new mechanisms responsible for driving winter smog episodes in cities and for potentially accelerating the loss of Arctic sea ice, in addition to studies of the impact of cosmic rays on clouds and climate (CERN Courier July/August 2020 p48). 

“When CLOUD started operation, the prevailing understanding was that sulphuric acid vapour alone could account for almost all observations of new-particle formation in the atmosphere,” says Kirkby. “Our first experiments showed that it was around one million times too slow, and CLOUD went on to discover that additional vapours – especially biogenic vapours from trees – form particles together with stabilisers like ammonia, amines or ions from cosmic rays. CLOUD has now established a mechanistic understanding of aerosol particle formation for global climate models – but our work isn’t finished yet.”

The post Accelerating aerosol production appeared first on CERN Courier.

Toward a diffraction limited storage-ring-based X-ray source https://cerncourier.com/a/toward-a-diffraction-limited-storage-ring-based-x-ray-source/ Mon, 04 Apr 2022 12:27:33 +0000 https://preview-courier.web.cern.ch/?p=98060 This webinar is available to watch now, presented by SLAC accelerator physicist Pantaleo Raimondi.


Multi-bend achromat (MBA) lattices have initiated a fourth generation of storage-ring light sources, with orders-of-magnitude increases in brightness and transverse coherence. A few MBA rings have been built, and many others are in design or construction worldwide, including upgrades of the APS and ALS in the US.

The HMBA (hybrid MBA) lattice, developed for the successful ESRF–EBS upgrade, has proven very effective in addressing the nonlinear-dynamics challenges associated with pushing the emittance towards the diffraction limit. The evolution of HMBA ring designs will be described in this seminar. The new designs break the lattice periodicity found in traditional circular light sources, inserting dedicated sections for efficient injection and additional emittance damping.

Techniques developed for high-energy-physics rings to mitigate the nonlinear-dynamics challenges associated with breaking periodicity at collision points were applied in the HMBA designs for the injection and damping sections, and were also used to optimise the nonlinear dynamics of the individual HMBA cells. The resulting HMBA can deliver the long-sought diffraction-limited source while maintaining the temporal and transverse stability of third-generation light sources: the nonlinear-dynamics optimisation enables a long lifetime and traditional off-axis injection, improving upon the performance of rings now under construction.

Pantaleo Raimondi is a professor and research technical manager at SLAC National Accelerator Laboratory, and was previously director of the Accelerator and Source Division at ESRF.
The post Toward a diffraction limited storage-ring-based X-ray source appeared first on CERN Courier.

]]>
Webinar This webinar is available to watch now, presented by SLAC accelerator physicist Pantaleo Raimondi. https://cerncourier.com/wp-content/uploads/2022/03/2022-04-20-webinar-image.jpeg
Crab cavities enter next phase https://cerncourier.com/a/crab-cavities-enter-next-phase/ Wed, 09 Mar 2022 14:28:55 +0000 https://preview-courier.web.cern.ch/?p=97851 Rama Calaga describes the latest progress in building the superconducting radio-frequency “crab” cavities that will increase the probability of collisions at the High-Luminosity LHC.

The post Crab cavities enter next phase appeared first on CERN Courier.

]]>
The imminent start of LHC Run 3 following a vast programme of works completed during Long Shutdown 2 marks a milestone for the CERN accelerator complex. When stable proton beams return to the LHC this year (see LHC Run 3: the final countdown), they will collide at a higher energy (13.6 TeV compared to 13 TeV) and with higher bunch intensities (up to 1.8 × 10¹¹ protons per bunch compared to 1.3–1.4 × 10¹¹) than in Run 2. Physicists working on the LHC experiments can therefore look forward to a rich harvest of results during the next three years. After Run 3, the statistical gain in running the accelerator without a significant luminosity increase beyond its design and ultimate values will become marginal. Therefore, to maintain scientific progress and to exploit its full capacity, the LHC is undergoing upgrades that will allow a decisive increase of its luminosity during Run 4, expected to begin in 2029, and beyond.

Several technologies are being developed for this High-Luminosity LHC (HL-LHC) upgrade. One is new, large-aperture quadrupole magnets based on a niobium-tin superconductor. These will be installed on either side of the ATLAS and CMS experiments, providing the space required for smaller beam-spot sizes at the interaction points and shielding against the higher radiation levels when operating at increased luminosities. The other key technology, necessary to take advantage of the smaller beam-spot size at the interaction points, is a series of superconducting radio-frequency (RF) “crab” cavities that enlarge the overlap area of the incoming bunches and thus increase the probability of collisions. Never used before at a hadron collider, a total of 16 compact crab cavities will be installed on either side of each of ATLAS and CMS once Run 3 ends and Long Shutdown 3 begins.

The crab-cavity test facility

At a collider such as the LHC, it is imperative that the two counter-circulating beams are physically separated by an angle, known as the crossing angle, such that bunches collide at only a single location over the common interaction region (where the two beams share the same beam pipe). The bunches at the HL-LHC will be 10 cm long and only 7 μm wide at the collision points, resembling long, thin wires. As a result, even a very small angle between the bunches implies an immediate loss in luminosity. With the use of powerful superconducting crab cavities, the tilt of the bunches at the collision point can be precisely controlled to make it optimal for the experiments and fully exploit the scientific potential of the HL-LHC.
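That sensitivity can be made quantitative. For Gaussian bunches crossing at a full angle θ_c, the luminosity is reduced by the standard geometric factor below; the crossing angle used in the numerical estimate is an assumed, representative HL-LHC value (about 500 μrad) rather than a figure quoted in this article, and the 10 cm and 7 μm above are read as RMS bunch dimensions:

```latex
R = \frac{1}{\sqrt{1+\Phi^{2}}},\qquad
\Phi = \frac{\theta_{c}\,\sigma_{z}}{2\,\sigma_{x}}
\approx \frac{(5\times 10^{-4})\,(0.1\,\mathrm{m})}{2\,(7\times 10^{-6}\,\mathrm{m})} \approx 3.6
\quad\Rightarrow\quad R \approx 0.27
```

In other words, without crabbing, roughly 70% of the head-on luminosity would be forfeited – the loss that the crab cavities are designed to recover.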

Radical concepts 

The tight space constraints arising from the relatively small separation of the two beams outside the common interaction region require a radically new RF concept for particle deflection, employing a novel shape and significantly smaller cavities than those used in other accelerators. Designs for such devices began around 10 years ago, with CERN settling on two types: double quarter wave (DQW) and RF-dipole (RFD). The former will be fitted around CMS, where bunches are separated vertically, and the latter around ATLAS, where bunches will be separated horizontally, requiring crab cavities uniquely designed for each plane. It is also planned to swap the crossing-angle planes and crab-cavity installations at a later stage during HL-LHC operation.

The RF-dipole cavity

In 2017, two prototype DQW-type cavities were built and assembled at CERN into a special cryomodule and tested at 2 K, validating the mechanical, cryogenic and RF functioning. The module was then installed in the Super Proton Synchrotron (SPS) for beam tests, with the world’s first “crabbing” of a proton beam demonstrated on 31 May 2018. In parallel, the fabrication of two prototype RFD-type cavities from high-purity niobium was underway at CERN. Following the integration of the devices into a titanium helium tank at the beginning of 2021, and successful tests at 2 K reaching voltages well beyond the nominal value of 3.4 MV, the cavities were equipped with specially designed RF couplers, which are necessary for beam operations. The two cavities are now being integrated into a cryomodule at Daresbury Laboratory in the UK as a joint effort between CERN and the UK’s Science and Technology Facilities Council (STFC). The cryomodule will be installed in a 15 m-long straight section (LSS6) of the SPS in 2023 for its first test with proton beams. This location in the SPS is equipped with a special by-pass and other services, which were put in place in 2017–2018 to test and operate the DQW-type module. 

The manufacturing challenge 

Due to the complex shape and micrometric tolerances required for the HL-LHC crab cavities, a detailed study was performed to realise the final shape through forming, machining, welding and brazing operations on the high-purity niobium sheets and associated materials (see “Fine machining” images). To ensure a uniform removal of material along the cavities’ complex shape, a rotational buffered chemical polishing (BCP) facility was built at CERN for surface etching of the HL-LHC crab cavities. For the RFD and DQW cavities, the rotational setup etches approximately 250 μm from the internal RF surface to remove the layer damaged during the forming process. Ultrasound measurements were performed to follow the evolution of the cavity-wall thickness during the BCP steps, showing remarkable uniformity (see “Chemical etching” images).

The chemical-etching setup

Preparation of the RFD cavities followed a similar process to that used for the DQW modules. Following chemical etching and a very high-temperature bake at 650 °C in a vacuum furnace, the cavities are rinsed in ultra-pure water at high pressure (100 bar) for approximately seven hours. This process has proven to be a key step in HL-LHC crab-cavity preparation, enabling extremely high fields by suppressing electron field-emitters, which can limit performance. The cavity is then closed with its RF ancillaries in an ISO4 cleanroom environment to preserve the ultra-clean RF surface, and installed into a special vertical cryostat to cool the cavity surface to its 2 K operating temperature (see “Clean and cool” image, top). Both RFD cavities performed well above the nominal target of 3.4 MV: RFD1 exceeded the nominal voltage by more than 50%, and RFD2 by more than a factor of two (7 MV) – a world-record deflecting field in this frequency range. These performances were reproducible after the assembly and welding of the helium tank, owing to the careful preparation of the RF surface throughout each step of preparation and assembly.

RF dipole cavity and cold magnetic shield

The helium tank provides a volume around the cavity surface that is maintained at 2 K with superfluid helium (see “Clean and cool” image, bottom). Because sizeable deformations occur during cool-down from ambient temperature, a titanium vessel, whose thermal behaviour is close to that of the niobium cavity, is used. A magnetic shield between the cavity and the helium tank suppresses stray fields in the operating environment and further preserves cavity performance. Following the tests with helium tanks, the cavities were equipped with higher-order-mode couplers and field antennae to undergo a final test at 2 K before being assembled into a two-cavity string and installed in a cryostat.

The crab cavities require many ancillary components to allow them to function. This overall system is known as a cryomodule (see “Cryomodule” image, top) and ensures that the operational environment is correct, including the temperature, stability, vacuum conditions and RF frequency of the cavities. Technical challenges arise due to the need to assemble the cavity string in an ISO4 cleanroom, the space constraints of the LHC (leading to the rectangular compact shape), and the requirement of fully welded joints (where typically “O” rings would be used for the insulation vacuum).

Design components

The outer vacuum chamber (OVC) of the cryomodule provides an insulation vacuum that prevents heat from leaking in from the environment, as well as providing interfaces to any external connections. Manufactured by ALCA Technology in Italy, the OVC uses a rectangular design in which the cavity string is mounted on a top-plate that is lowered into the rest of the vessel, and includes four large windows to allow access for repair in situ if required (see “Cryomodule” image, bottom). Since the first DQW prototype module, several cryomodule interfaces, including cryogenic and vacuum components, have been updated to be fully compatible with the final installation in the HL-LHC.

Since superconducting RF cavities can have a higher surface resistance if cooled below their transition temperature in the presence of a magnetic field, they need to be shielded from Earth’s magnetic field and stray fields in the surrounding environment. This is achieved using a warm magnetic shield mounted in the OVC and a cold magnetic shield mounted inside the liquid-helium vessel. Both shields, which are made from special nickel–iron alloys, are manufactured by Magnetic Shields Ltd in the UK.

Status and outlook

The RFD crab-cavity pre-series cryomodule will be assembled this year at Daresbury lab, where the infrastructure on site has been upgraded, including an extension to the ISO4 cleanroom area and the introduction of an ISO6 preparation area. A bespoke five-tonne crane has also been installed and commissioned to allow the precise lowering of the delicate cavity string into the outer vacuum vessel.

RF dipole cryomodule and outer vacuum vessel

Parallel activities are taking place elsewhere. The HL-LHC crab-cavity programme has developed into a mature project supported by a large number of collaborating institutions around the world. In the US, the Department of Energy is supporting the HL-LHC Accelerator Upgrade Project to coordinate the efforts and leverage the expertise of a group of US laboratories and universities (FNAL, BNL, JLAB, SLAC, ODU) to deliver the series RFD cavities for the HL-LHC. In 2021, two RFD prototype cavities were built by the US collaboration and exceeded the two most important functional project requirements for crab cavities – deflecting voltage and quality factor. After this successful demonstration, the fabrication of the pre-series cavities was launched.

Crab cavities were first implemented in an accelerator in 2006, at the KEKB electron–positron collider in Japan, where they helped the collider reach record luminosities. A different “crab-waist” scheme is currently employed at KEKB’s successor, SuperKEKB, helping to reach even higher luminosities. The development of ultra-compact, very-high-field cavities for a high-energy hadron collider such as the HL-LHC is even more challenging, and will be essential to maximise the scientific output of this flagship facility beyond the 2030s. 

Beyond the HL-LHC, the compact crab-cavity concepts have been adopted by future facilities, including the proton–proton stage of the proposed Future Circular Collider; the Electron–Ion Collider under construction at Brookhaven; bunch compression in synchrotron X-ray sources to produce shorter pulses; and ultrafast particle separators in proton linacs to separate bunches of secondary particles for different experiments. The full implementation of this technology at the HL-LHC is therefore keenly awaited. 

The post Crab cavities enter next phase appeared first on CERN Courier.

]]>
Feature Rama Calaga describes the latest progress in building the superconducting radio-frequency “crab” cavities that will increase the probability of collisions at the High-Luminosity LHC. https://cerncourier.com/wp-content/uploads/2022/03/CCMarApr22_CRAB-frontis.jpg
The LS2 vacuum challenge https://cerncourier.com/a/the-ls2-vacuum-challenge/ Wed, 09 Mar 2022 08:25:20 +0000 https://preview-courier.web.cern.ch/?p=97877 An exhaustive programme of work spanning mechanical repairs and upgrades prepares the CERN vacuum systems for more luminous beam operations ahead.

The post The LS2 vacuum challenge appeared first on CERN Courier.

]]>
Carbon coating of a beam screen

The second long shutdown of the CERN accelerator complex (LS2) is complete. After three years of intense work at all levels across the accelerators and experiments, beams are expected in the LHC in April. For the accelerators, the main LS2 priorities were the consolidation of essential safety elements (dipole diodes) for the LHC magnets, several interventions for the High-Luminosity LHC (HL-LHC) and associated upgrades of the injection chain via the LHC Injectors Upgrade project. Contributing to the achievement of these and many other planned parallel activities, the CERN vacuum team has completed an intense period of work in the tunnels, workshops and laboratories.

Particle beams require extremely low pressure in the pipes in which they travel to ensure that their lifetime is not limited by interactions with residual gas molecules and to minimise backgrounds in the physics detectors. During LS2, all of the LHC’s arcs were vented to the air after warm-up to room temperature and all welds were leak-checked after the diode consolidation (with only one leak found among the 1796 tests performed). The vacuum team also replaced or consolidated around 150 turbomolecular pumps acting on the cryogenic insulation vacuum. In total, 2.4 km of non-evaporable-getter (NEG)-coated beampipes were also opened to the air at room temperature – an exhaustive programme of work spanning mechanical repair and upgrade (across 120 weeks), bake-out (90 weeks) and NEG activation (45 weeks). The vacuum level in these beampipes is now in the required range, with most of the pressure readings below 10⁻¹⁰ mbar.
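To put a reading of 10⁻¹⁰ mbar in perspective, the back-of-the-envelope sketch below uses textbook kinetic-theory formulas; the molecular diameter is an assumed value for N₂, not a measured CERN figure:

```python
import math

K_B = 1.380649e-23      # Boltzmann constant, J/K
T = 293.0               # room temperature, K
P = 1e-10 * 100.0       # 1e-10 mbar expressed in Pa (1 mbar = 100 Pa)
D = 3.7e-10             # assumed kinetic diameter of an N2 molecule, m

# Ideal-gas number density: n = p / (k_B * T)
n = P / (K_B * T)

# Kinetic-theory mean free path: l = k_B * T / (sqrt(2) * pi * d^2 * p)
mfp = K_B * T / (math.sqrt(2) * math.pi * D**2 * P)

print(f"number density: {n * 1e-6:.1e} molecules per cm^3")  # ~2.5e6
print(f"mean free path: {mfp * 1e-3:.0f} km")                # ~660 km
```

At such rarefaction a molecule typically travels hundreds of kilometres before meeting another, so the residual pressure is governed by what desorbs from the walls rather than by gas-phase collisions, hence the emphasis on bake-out and NEG activation.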

CMS beampipe and RF box installation

The vacuum control system was also significantly improved by reducing single points of failure, removing confusing architectures and, for the first time, using mobile vacuum equipment controlled and monitored wirelessly. In view of the higher LHC luminosity and the consequent higher radioactivity dose during Run 3 and beyond, the vacuum group has developed and installed new radiation-tolerant electronics controlling 100 vacuum gauges and valves in the LHC dispersion suppressors. This was the first step of a larger campaign to be implemented in the next long-shutdown, including the production of 1000 similar electronics cards for vacuum monitoring. In parallel, the control software was renewed. This included the introduction of resilient, scalable and self-healing web-based frameworks used by the biggest names in industry.

In the LHC experimental areas, the disassembly of the vacuum chambers at the beginning of LS2 required 93 interventions and 550 person-hours of work in the caverns, with the most impressive change in vacuum hardware implemented in CMS and LHCb (see “Interaction points” images). In CMS, a new 7.3 m-long beryllium beam-pipe with an internal diameter of 43.4 mm was installed, and 12 new aluminium chambers were manufactured, surface-finished and NEG-coated at CERN. The mechanical installation, including alignments, pump-down and leak detection, took two months, while the bake-out and venting with ultra-pure neon required a further month. In LHCb, the vacuum team contributed to the new Vertex Locator (VELO). Its “RF box” – a delicate piece of equipment filled with silicon detectors, electronics and cooling circuits designed to protect the VELO without affecting the beams – is situated just a few mm from the beam, with an aluminium window thinned down to 150 μm by chemical etching and then NEG-coated. As the VELO encloses the RF box and both volumes are under separate vacua, the pump-down is a critical operation because pressure differences across the thin window must be lower than 10 mbar to ensure mechanical integrity. The last planned activity for the vacuum team in LS2, the bake-out of the ATLAS beam pipes, took place in February.

Vacuum challenges 

From the list of successful achievements, it could be assumed that vacuum activities in LS2 have gone smoothly, with the team applying well-known procedures and know-how accumulated over decades. However, as might be expected when working with several teams in parallel and at the limits of technology, with around 100 km of piping under vacuum for the LHC alone, this is far from the case. Since the beginning of LS2, CERN vacuum experts have faced several technical issues and obstacles, a few of which deserve a mention (see “Overcoming the LS2 vacuum obstacles” panel). All these headaches have challenged our regular way of working and prompted us to reflect on procedures, communication and reporting, and technical choices.

SPS vacuum team

But the real moment of truth is still to come, when the intensity of the LHC beams reaches the new nominal value boosted by the upgraded injectors. Under the spotlight will be surface electron emission, which drives the formation of electron clouds and their consequences, including beam instabilities and heat load on the cryogenic system. The latter showed anomalously high values during Run 2, with strong inhomogeneity along the ring indicating uneven surface conditioning. The question is: what will happen to the heat load during Run 3? Thanks to the efforts and achievements of a dedicated task force, the scrubbing and subsequent physics runs will provide a detailed answer in a few months. Last year, the task force installed additional instrumentation in the cryogenic lines at selected positions and, after many months of detective work, identified the most probable culprit of the puzzling heat-load values: the formation of a non-native copper oxide layer during electron bombardment of hydroxylated copper surfaces at cryogenic temperatures. UV exposure in selected gases, local bake-out and plasma etching are among the mitigation techniques we are going to investigate.

The HL-LHC horizon 

LS2 might only just have finished, but we are already thinking about LS3 (2026–2028), whose leitmotif will be the finalisation of the HL-LHC project. Thanks to more focused beams at the collision points and an increased proton bunch population, the higher beam luminosity at CMS and ATLAS (peaking at a levelled value of 5 × 10³⁴ cm⁻²s⁻¹) will enable an integrated luminosity of 3000 fb⁻¹ in 12 years. For the HL-LHC vacuum systems, this requires a completely new design of the beam screens in the focusing area of the experiments, the implementation of carbon thin-film coatings in the unbaked beampipes to cope with the lower secondary-electron-yield threshold, and radiation-compatible equipment near the experiments and radiation-tolerant electronics down to the dispersion-suppressor zones.
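The headline figures are easy to sanity-check. The sketch below assumes, purely for illustration, that the entire 3000 fb⁻¹ is collected at the levelled luminosity:

```python
L_LEVELLED = 5e34        # levelled luminosity, cm^-2 s^-1
TARGET_FB = 3000.0       # integrated-luminosity goal, fb^-1
CM2_PER_FB_INV = 1e39    # 1 fb^-1 = 1e39 cm^-2
YEARS = 12

total_seconds = TARGET_FB * CM2_PER_FB_INV / L_LEVELLED   # ~6e7 s
days_per_year = total_seconds / YEARS / 86400.0

print(f"levelled running needed: {total_seconds:.1e} s in total")
print(f"per year: about {days_per_year:.0f} days of levelled physics")
```

Roughly 60 days of levelled physics per year is a demanding duty, which is one reason the availability of every subsystem, vacuum included, matters so much.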

Overcoming the LS2 vacuum obstacles

Unexpected interventions

Forgotten sponge 

During the first beam-commissioning of the PS, anomalous high proton losses were detected, generating pressure spikes and a high radioactive dose near one of the magnets. An endoscopic inspection (see image above, left) revealed the presence of an orange sponge that had been used to protect the vacuum chamber extremities before welding (and which had been left behind due to a miscommunication between the teams involved), blocking the lower half of the beam pipe. After days of investigation with the beams and interventions by technicians, the chamber was cut open and the offending object removed. 

Leaky junctions 

Having passed all tests before installation, new corrugated thin-walled vacuum chambers installed in the Proton Synchrotron Booster to reduce eddy-current effects suffered vacuum leaks after a few days of magnet pulsing. The leaks appeared in lip-welded junctions in several chambers, indicating a systematic production issue. Additional spare chambers were produced and, as the leaks remain tolerable, a replacement is planned during the next year-end technical stop. Until then, the issue will hang like a Sword of Damocles over the heads of the vacuum teams in charge of the LHC’s injectors.

Powering mismatch 

During the first magnet tests of the TT2 transfer line, a vacuum sector was suddenly air-vented. The support of the vacuum chambers was found to be broken; two bellows were destroyed (see image, middle), and the vacuum chamber twisted. The origin of the problem was a different powering scheme of the magnet embedding the chamber: faster magnetic pulses generated higher eddy-current and Lorentz forces that were incompatible with the beampipe design and supports. It was solved by inserting a thin insulation layer between vacuum flanges to interrupt the eddy current, a practice common in other parts of the injectors. 

QRL quirks 

The LHC’s helium transfer lines (QRL) require regular checks, especially after warm-up and cool-down. During LS2, the vacuum team installed two additional turbomolecular pumps to compensate for the rate increase of a known leak in sector B12, allowing operation until at least the next long shutdown. Another troubling leak, which opened only at helium pressures above 7 bar, was detected in a beam-screen cooling circuit. Fixing it would have required the replacement of the nearby magnet, but the leak turned out to be tolerable at cryogenic temperatures, although its on/off behaviour remains to be fully elucidated.

Damaged disks 

Installed following the incident in sector 3–4 shortly after LHC startup, the beam vacuum in the LHC arcs is protected against overpressure by 832 “burst disks”. A 30 μm-thick stainless-steel disk membrane nominally breaks when the pressure in the vacuum system is 0.5 bar higher than the tunnel air pressure. Despite the careful venting procedure, 19 disks were either broken or damaged before the re-pumping of the arcs. Subsequent lab tests showed no damage in spare disks cycled 30 times at 1.1 bar. The vacuum teams replaced the damaged disks and are trying to understand the cause.

Buckled fingers 

Before cool-down, a 34 mm-diameter ball fitted with a 40 MHz transmitter is pushed through the LHC beam pipes to check for obstacles. The typical defect is a buckling of the RF fingers in the plug-in modules (PIMs) that maintain electrical continuity as the machine thermally contracts. Unfortunately, in two cases the ball arrived damaged, and it took days to collect and identify all the broken pieces. A buckled finger was successfully found in sector 8-1, but another in sector 2-3 (see image, right) was revealed only when the pilot beam circulated. This forced a re-warming of the arc, venting of the beampipe and the replacement of the damaged PIM, followed by additional re-cooling and aperture and electrical tests. 

The first piece of vacuum equipment concerned is the “VAX”: a compact set of components, pumps, valves and gauges installed in an area of limited access and relatively high radioactivity between the last focusing magnet of the accelerator and the high-luminosity experiments. The VAX module is designed to be fully compatible with robot intervention, enabling leak detection, gasket change and complete removal of parts to be carried out remotely and safely. 

Despite the massive shielding between the experiment caverns and the accelerator tunnels, secondary particles from high-energy proton collisions can reach accelerator components outside the detector area. At nominal HL-LHC luminosity, up to 3.8 kW of power will be deposited in the tunnel on each side of CMS and ATLAS, of which 1.2 kW is intercepted by the 60 m-long sequence of final-focusing magnets. Such a heat load is incompatible with magnet cooling at 1.9 K and, in the long run, could cause the insulation of the superconducting cables to deteriorate. To avoid this, the vacuum team designed a new beam screen equipped with tungsten-alloy shielding so that at least half of the power is captured before being transmitted to the magnet cold mass.

The new HL-LHC beam screens took several years of design and manufacturing optimisation, multi-physics simulations and tests with prototypes. The most intense study concerned the mechanical integrity of this complicated object when the hosting magnet undergoes a quench, causing the current to drop from nearly 20 kA to 0 kA in a few tenths of a second. The manufacturing learning phase is now complete and the beam-screen facility will be ready this year, including the new laser-welding robot and cryogenic test benches. Carbon coating is the additional novelty of the HL-LHC beam screens, with the purpose of suppressing electron clouds (see “Beam screen” image). At the beginning of LS2 the first beam screens were successfully coated in situ, involving a small robot carrying carbon and titanium targets, and magnets for plasma confinement during deposition. 

The vacuum team is also involved in the production of crab cavities, another breakthrough brought by the HL-LHC project. The surfaces of these complex-shaped niobium objects are treated by a dedicated machine that can provide rotation while chemically polishing with a mixture of nitric, hydrofluoric and phosphoric acids. The vacuum system of the cryomodules in which the cavities are cooled at 2 K was also designed at CERN. 

Outlook

Vacuum technology for particle accelerators has been pioneered by CERN since its early days, with the Intersecting Storage Rings bringing the most important breakthroughs. Over the decades, the CERN vacuum group has brought together surface-physics specialists, thin-film coating experts and galvanic-treatment professionals, along with teams of designers and colleagues dedicated to the operation of large vacuum equipment. In doing so, CERN has become one of the world’s leading R&D centres for extreme vacuum technology, contributing to major existing and future accelerator projects at CERN and beyond. With the HL-LHC in direct view, the vacuum team looks forward to attacking new challenges. For now, though, all eyes are on the successful restart of the CERN accelerator complex and the beginning of LHC Run 3.

The post The LS2 vacuum challenge appeared first on CERN Courier.

]]>
Feature An exhaustive programme of work spanning mechanical repairs and upgrades prepares the CERN vacuum systems for more luminous beam operations ahead. https://cerncourier.com/wp-content/uploads/2022/03/CCMarApr22_VACUUM-feature.jpg
Celebrating 20 years of n_TOF https://cerncourier.com/a/celebrating-20-years-of-n_tof/ Mon, 07 Feb 2022 14:08:48 +0000 https://preview-courier.web.cern.ch/?p=97260 The hybrid event highlighted the ongoing achievements of CERN's n_TOF facility and its nuclear science and applications.

The post Celebrating 20 years of n_TOF appeared first on CERN Courier.

]]>
n_TOF

The Neutron Time Of Flight (n_TOF) facility at CERN, a project proposed by former Director General Carlo Rubbia in the late 1990s, started operations in 2001. Its many achievements during the past two decades, and future plans in neutron science worldwide, were the subject of a one-day hybrid event NSTAPP – Neutrons in Science, Technology and Applications organised by the n_TOF collaboration at CERN on 22 November.

At n_TOF, a 20 GeV/c proton beam from the Proton Synchrotron (PS) strikes an actively cooled pure-lead neutron spallation target. The generated neutrons are water-moderated to produce a spectrum that covers 11 orders of magnitude in energy, from GeV down to meV. At the beginning, n_TOF was equipped with a single experimental station, located 185 m downstream from the spallation target. In 2014, a major upgrade saw the construction and operation of a new experimental test area located 20 m above the production target to allow measurements of very low-mass samples. Last year, during Long Shutdown 2, a new third-generation, nitrogen-cooled spallation target was installed and successfully commissioned to prolong the experiment’s lifetime by ten years. At the same time, a new close-to-target irradiation and experimental station called NEAR was added to perform activation measurements relevant to nuclear astrophysics, as well as measurements in collaboration with the R2E (Radiation to Electronics) project that are difficult at other facilities.
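The facility’s name reflects its measurement principle: a neutron’s kinetic energy is deduced from its flight time over a known path. A minimal non-relativistic sketch for the 185 m station is shown below (adequate at low energies; GeV neutrons would require the relativistic expression):

```python
import math

M_N = 1.675e-27     # neutron mass, kg
EV = 1.602e-19      # joules per electronvolt
L = 185.0           # flight path to the first experimental station, m

def time_of_flight(energy_ev: float) -> float:
    """Non-relativistic flight time for a neutron of given kinetic energy."""
    v = math.sqrt(2.0 * energy_ev * EV / M_N)
    return L / v

for e in (0.025, 1.0, 1e6):   # thermal (25 meV), 1 eV, 1 MeV
    print(f"E = {e:>8g} eV -> t = {time_of_flight(e) * 1e3:.3g} ms")
```

The spread from about 85 ms (thermal) down to about 13 μs (1 MeV) illustrates why the short PS proton pulse matters: only a narrow production burst lets energies spanning many decades be disentangled in time.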

Advancing technology

During 20 years of activities, the n_TOF collaboration has carried out more than 100 experiments with considerable impact on nuclear astrophysics, advanced nuclear technologies and applied nuclear sciences, including novel medical applications. Understanding the origin of the chemical elements through the slow-neutron-capture process has been a particular highlight. The high instantaneous neutron flux, which is only available at n_TOF thanks to the short proton pulse delivered by the PS, provided key reaction rates relevant to big-bang nucleosynthesis and stellar evolution (the former attempting to explain the discrepancy between the predicted and observed amounts of lithium by investigating ⁷Be creation and destruction, and the latter determining the chemical history of our galaxy).

Basic nuclear data are also essential for the development of nuclear-energy technology. It was this consideration that motivated Rubbia to propose a spallation neutron source at CERN in the first place, prompting a series of accurate neutron cross-section measurements on minor actinides and fission products. Neutron reaction processes on thorium, neptunium, americium and curium, in addition to minor isotopes of uranium and plutonium, have all been measured at n_TOF. These measurements provide the nuclear data necessary for the development of advanced nuclear systems: increasing safety margins in existing nuclear plants, enabling generation-IV reactors and accelerator-driven systems, and even enabling new fuel cycles that reduce the amount of long-lived nuclear species.

Contributions from external laboratories, such as J-PARC (Japan), the Chinese Spallation Neutron Source (China), SARAF (Israel), GELINA (Belgium), GANIL (France) and Los Alamos (US), highlighted synergies in the measurement of neutron-induced capture, fission and light-charged-particle reactions for nuclear astrophysics, advanced nuclear technologies and medical applications. Moreover, technologies developed at CERN have also influenced the creation of two startups, Transmutex and Newcleo. The former focuses on accelerator-driven systems for energy production, for which the first physics validation was carried out at the FEAT and TARC experiments at the CERN PS in 1999, while the latter plans to develop critical reactors based on liquid lead.

With the recent technical upgrades and an exciting physics programme in different fields, such as experiments focusing on the breaking of isospin symmetry in neutron–neutron scattering, in addition to its core experimental activities, the n_TOF facility has a bright future ahead.

The post Celebrating 20 years of n_TOF appeared first on CERN Courier.

]]>
Meeting report The hybrid event highlighted the ongoing achievements of CERN's n_TOF facility and its nuclear science and applications. https://cerncourier.com/wp-content/uploads/2022/02/n_TOF-feature-image.jpg
Plasmas on target in vacuum science https://cerncourier.com/a/plasmas-on-target-in-vacuum-science/ Mon, 10 Jan 2022 14:17:20 +0000 https://preview-courier.web.cern.ch/?p=96875 CERN vacuum experts explore the critical role that surface modification plays in large-scale vacuum systems.

The post Plasmas on target in vacuum science appeared first on CERN Courier.

]]>
Set-up to evaluate new cathodes for sputter deposition of amorphous carbon thin films

Within a particle accelerator, the surface of materials directly exposed to the beams interacts with the circulating particles and, in so doing, influences the local vacuum conditions through which those particles travel. Put simply: accelerator performance is linked inextricably to the surface characteristics of the vacuum beam pipes and chambers that make up the machine. 

In this way, the vacuum vessel’s material top surface and subsurface layer (just a few tens of nm thick) determine, among many other characteristics, the electrical resistance of the beam image current, synchrotron light reflectivity, degassing rates and secondary electron yield under particle bombardment. The challenge for equipment designers and engineers is that while the most common structural materials used to fabricate vacuum systems – stainless steel, aluminium alloys and copper – ensure mechanical resistance against atmospheric pressure, they do not deliver the full range of chemical and physical properties required to achieve the desired beam performance. 

Aluminium alloys, though excellent in terms of electrical conductivity, suffer from high secondary electron emission. On the latter metric, copper represents a better choice, but can be inadequate regarding gas desorption and mechanical performance. And even though stainless steel is the workhorse of vacuum technology, thanks to its excellent mechanical and metallurgical behaviour, it lacks most of the required surface properties. The answer is clear: adapt the surface properties of these structural materials to the specific needs of the accelerator environment by coating them with more suitable materials, typically using electrochemical or plasma treatments. (For a review of electrochemical coating methods, see Surface treatment: secrets of success in vacuum science.)

Variations on the plasma theme

The emphasis herein is exclusively on plasma-based thin-film deposition, in which an electrically quasi-neutral state of matter (composed of positively and negatively charged particles) is put to work to re-engineer the physical and chemical properties of vacuum component and subsystem surfaces. A plasma can be produced by ionising gas atoms, so that the positive charges are ions and the negative ones are electrons. The most useful properties of the resulting gas plasma derive from the large difference in inertial mass between the particles carrying negative and positive charges. Owing to their much lower mass, electrons are far more responsive than ions to variations of the electromagnetic field, leading to separation of charges and electrical fields within the plasma. What’s more, the particle trajectories of ions and electrons also differ markedly.

Plasma sputtering sources

These characteristics can be exploited to deposit thin films and, more generally, to modify the properties of vacuum chamber and component surfaces. For such a purpose, noble-gas ions are extracted from a plasma and accelerated towards a negatively charged solid target. If the ions acquire enough kinetic energy (of the order of hundreds to thousands of eV), one of the effects of the bombardment is the extraction of neutral atoms from the target and their deposition on the surface of the substrate to be modified. This mechanism – called sputtering – is one of the methods used to produce thin films by physical vapour deposition (PVD), where film materials are extracted from a solid into a gas phase before condensing on a substrate.

In the plasma, the lost ions are reintroduced by electron ionisation of additional gas atoms. While the rate of ionisation is improved by increasing the gas density, an excessive gas density can have a detrimental effect on the sputtered atoms (as their trajectories are modified and their kinetic energy decreased by multiple collisions with gas atoms). The alternative is to increase the length of the electron trajectories by applying a magnetic field of several hundred gauss to the plasma. 

Contrary to ions – which are affected minimally – electrons spiral around the lines of force of the magnetic field in long helical curves, such that their probability of hitting an atom is higher. As electrons are sooner or later lost – either on the growing film or on nearby surfaces – the plasma is refilled by secondary electrons extracted from the target as a result of ion collisions. For a given set of parameters – among them target voltage, plasma power, gas pressure and magnetic flux density – the plasma ultimately attains stable conditions and a constant rate of deposition. Typical film thicknesses for accelerator applications range from a few tens of nm to 2–3 microns.
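This mass asymmetry can be quantified with the Larmor radius r = mv/(qB). The sketch below uses assumed, representative numbers (5 eV particles in a 300 gauss field) rather than measured source parameters:

```python
import math

Q = 1.602e-19               # elementary charge, C
M_E = 9.109e-31             # electron mass, kg
M_AR = 39.95 * 1.661e-27    # argon-ion mass, kg
B = 0.03                    # 300 gauss expressed in tesla
E_KIN = 5.0 * 1.602e-19     # assumed kinetic energy of 5 eV, in J

def gyroradius(mass: float) -> float:
    v = math.sqrt(2.0 * E_KIN / mass)   # speed from kinetic energy
    return mass * v / (Q * B)           # Larmor radius r = m v / (q B)

print(f"electron: r = {gyroradius(M_E) * 1e3:.2f} mm")   # ~0.25 mm
print(f"Ar+ ion : r = {gyroradius(M_AR) * 1e3:.0f} mm")  # ~68 mm
```

A sub-millimetre electron orbit is easily confined within a centimetre-scale discharge, whereas the ion orbit is comparable to the size of the whole source, which is why the field traps electrons while leaving the ions essentially unaffected.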

Unique requirements

The peculiarities of thin-film deposition for accelerator applications lie in the size of the objects to be coated and in the required purity of the coatings. Substrate areas, for example, range from a few cm² up to several m², and come in a great variety of 3D shapes and geometries. Large-aspect-ratio beam pipes several metres long, or complicated multicell cavities for RF applications, are typical substrates regularly coated at CERN. The coating process is implemented either in dedicated workshops or directly inside the accelerators during the retrofitting of installed equipment.

The simplest sputtering configuration can be deployed when coating a cylindrical beam pipe. The target, which is made of a wire or a rod of the material to be deposited, is aligned along the longitudinal axis of the beam pipe. Argon is the most commonly used noble gas, at a pressure that depends on the cross-section – i.e. the smaller the diameter, the higher the pressure (a typical value for vacuum chambers that are a few centimetres in diameter is 0.1 mbar). The plasma is ignited by polarising the target negatively (at a few hundred volts) using a DC power supply while keeping the pipe grounded. It’s possible to reduce the pressure by one or two orders of magnitude if a magnetic field is applied parallel to the target (owing to the longer electron paths). In this case, the deposition technique is known as DC magnetron sputtering. 

An in-vacuum cable spool for electrical powering of a movable sputtering target

When the substrate is not a simple cylinder, however, the target design becomes more complicated. That’s because of the need to accommodate different target–substrate distances, while the angle of incidence of sputtered atoms on the substrate is also subject to change. As a result, the deposited film might have different thicknesses and uneven properties at different locations on the substrate (owing to dissimilar morphologies, densities and defects, including voids). These weaknesses have been addressed, in large part, over recent years with a new family of sputtering methods called high-power impulse magnetron sputtering (HiPIMS). 

In HiPIMS, short plasma pulses (of the order of 10–100 μs) of high power density (kW/cm² regime) are applied to the target. The discharge is shut down between two consecutive pulses for a duration of about 100–1000 μs; in this way, the duty cycle is low (less than 10%) and the average power ensures there is no overheating and deformation of the target. The resulting plasma, though, is about 10 times denser (approximately 10¹³ ions/cm³) than in standard DC magnetron sputtering – a figure of merit that, thanks to a bias voltage applied to the substrate, ensures a higher fraction of ionised sputtered atoms are transported to the surfaces to be coated.
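The duty-cycle arithmetic is simple enough to spell out; the pulse values below are illustrative choices within the ranges quoted above:

```python
T_ON = 50e-6       # pulse length, s (within the quoted 10-100 us)
T_OFF = 950e-6     # inter-pulse gap, s (within the quoted 100-1000 us)
P_PEAK = 1e3       # assumed peak power density, W/cm^2 (the kW/cm^2 regime)

duty_cycle = T_ON / (T_ON + T_OFF)    # fraction of time the plasma is driven
p_average = duty_cycle * P_PEAK       # time-averaged load on the target

print(f"duty cycle   : {duty_cycle:.1%}")          # 5.0%, below the quoted 10%
print(f"average power: {p_average:.0f} W/cm^2")    # ~50 W/cm^2
```

The average thermal load on the target is thus only a few per cent of the peak value, which is what keeps the target from overheating and deforming despite the very dense pulsed plasma.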

The impingement of such energetic ions produces denser films and reduces the columnar structure that results when sputtered atoms are deposited along lines of sight. As the bias voltage is not always a safe and practical solution, the CERN vacuum team has successfully tested the application of a positive pulse to the target immediately after the main negative pulse. The effect is an increase in the energy of the ionised sputtered atoms, with results equivalent to those obtained with a bias voltage (though with a simpler implementation for accelerator components).

Owing to the variety of materials and shapes encountered along a typical (or atypical) beamline, the HiPIMS target geometries and coating parameters must be optimised for each distinct family of accelerator components. This optimisation phase is traditionally experimental, based on testing and measurement of “coupon samples” and then prototypes. In the last five years, however, the CERN team has reinforced these experimental studies with 3D simulations based on a particle-in-cell Monte Carlo/direct simulation Monte Carlo (PICMS/DSMC) code – a capability originally developed at the Fraunhofer Institute for Surface Engineering and Thin Films (IST) in Braunschweig, Germany. 

Surface cleaning: putting plasmas to work

Notwithstanding their central role in thin-film deposition, plasmas are also used at CERN to clean surfaces for vacuum applications and to enhance the adherence of thin films. A case in point is the application of plasmas containing oxygen ions and free radicals (highly reactive chemical species) for the removal of hydrocarbons. In short: the ions and radicals are driven toward the contaminated surface, where they can decompose hydrocarbon molecules and form volatile species (e.g. CO and CO₂) for subsequent evacuation.

It’s a method regularly used to clean beryllium surfaces (which cannot be treated by traditional chemical methods for safety reasons). If the impingement kinetic energy of the oxygen ions is about 100 eV, the chemical reaction rate on the surface is much larger than the beryllium sputtering rate, such that cleaning is possible without producing hazardous compounds of the carcinogenic metal. 

Meanwhile, plasma treatments have recently been proposed for the cleaning of stainless-steel radioactive components when they are dismounted from accelerators, modified and then reinstalled. Using a remote plasma source, the energy of the plasma’s oxygen ions is chosen (<50 eV) so as to avoid sputtering of the component materials, thereby preventing radioactive atoms from entering the gas phase. The main difficulty here is to adapt the plasma source to the wealth of different geometries that are typical of accelerator components.

Plasma versatility

So much for the fundamentals of plasma processing, what of the applications? At CERN, the large-scale deployment of thin-film coatings began in the 1980s on the Large Electron–Positron (LEP) collider. To increase LEP’s collision energy to 200 GeV and above, engineering teams studied, and subsequently implemented, superconducting niobium (Nb) thin films deposited on copper (Cu) for the RF cavities (in place of bulk niobium). 

This technology was also adopted for the Large Hadron Collider (LHC), the High Intensity and Energy ISOLDE (HIE ISOLDE) project at CERN and other European accelerators operating at fields up to 15 MV/m. The advantages are clear: lower cost, better thermal stability (thanks to the higher thermal conductivity of the copper substrate), and reduced sensitivity to trapped magnetic fields. The main drawback of Nb/Cu superconducting RF cavities is an exponential growth of the power lost in an RF cycle with the accelerating electrical field (owing to resistivity and magnetic permeability of the Nb film). This weakness, although investigated extensively, has eluded explanation and proposed mitigation for the past 20 years. 

It’s only lately, in the frame of studies for the proposed electron–positron Future Circular Collider (FCC-ee), that researchers have shed light on this puzzling behaviour. Those insights are due, in large part, to a deeper theoretical analysis of Nb thin-film densification as a result of HiPIMS, though a parallel line of investigation involves the manufacturing of seamless copper cavities and their surface electropolishing. In both cases, the objective is the reduction of defects in the substrate to enhance film adherence and purity. 

UHV system equipped with an energy-resolved mass spectrometer for the characterisation of HiPIMS plasma discharges

Related studies have shown that Nb films on Cu can perform as well as bulk Nb in terms of superconducting RF properties, though coating materials other than Nb are also under investigation. Today, for example, the CERN vacuum group is evaluating Nb₃Sn and V₃Si – both of which are part of the A15 crystallographic group and exhibit superconducting transition temperatures of about 18 K (i.e. 9 K higher than Nb). This higher critical temperature would allow the use of RF cavities operating at 4.3 K (instead of 1.9 K), yielding significant simplification of the cryogenic infrastructure and reductions in electrical energy consumption. Even so, intense development is still necessary before these coatings can really challenge pure Nb films – not least because A15 films are brittle, plus the coating of such materials is tricky (given the need to reproduce a precise stoichiometry and crystallographic structure).

Game-changing innovations

Another wide-scale application of plasma processing at CERN is the deposition of non-evaporable-getter (NEG) thin-film coatings, specialist materials originally developed to provide distributed vacuum pumping for the LHC. NEG coatings comprise a mixture of titanium, zirconium and vanadium with a typical composition around 30:30:40, respectively. For plasma deposition of NEG films, the target (comprising three interlacing elemental wires) is pulled along the main axis of the beam pipes. Once the coated vacuum chambers are installed within an accelerator and pumped out, the NEG films are heated for 24 hours at temperatures ranging from 180 to 250 °C – a process known as activation, in which the superficial oxide layer and any contaminants are dissolved into the bulk.

The clean surfaces obtained in this way chemically adsorb most of the gas species in the vacuum system at room temperature – except for noble gases (which are chemically inert) and methane (for which small auxiliary pumps are necessary). The NEG-coated surfaces provide an impressively high pumping speed and, thanks to their cleanliness, a lower desorption yield when bombarded by electrons, photons and ions – and all this with minimal space occupancy. Moreover, owing to their maximum secondary electron yields (δmax) below 1.3, NEG coatings avoid the development of electron multipacting, the main cause of electron clouds in beam pipes (and related unfavourable impacts on beam performance, equipment operation and cryogenic heat load). 
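Kinetic theory sets a hard ceiling on any surface pump: it cannot capture molecules faster than they arrive. The sketch below is for hydrogen at room temperature; the sticking probability is an assumed order of magnitude for activated NEG, not a CERN-quoted figure:

```python
import math

R_GAS = 8.314        # molar gas constant, J/(mol K)
T = 293.0            # K
M_H2 = 2.016e-3      # molar mass of H2, kg/mol
STICKING = 0.01      # assumed sticking probability on activated NEG

v_mean = math.sqrt(8.0 * R_GAS * T / (math.pi * M_H2))   # mean molecular speed, m/s
s_ideal = v_mean / 4.0       # ideal areal pumping speed, m^3 s^-1 per m^2
s_ideal_l = s_ideal * 0.1    # converted to litres per second per cm^2

print(f"mean molecular speed: {v_mean:.0f} m/s")              # ~1750 m/s
print(f"ideal wall limit    : {s_ideal_l:.0f} l/s per cm^2")  # ~44
print(f"with sticking = 0.01: {STICKING * s_ideal_l:.2f} l/s per cm^2")
```

Even at a one-per-cent sticking probability, every square centimetre of coated wall pumps like a small dedicated pump, which is what distributed pumping means in practice.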

More broadly, plasma processing of NEG coatings represents a transformative innovation in the implementation of large-scale vacuum systems. Hundreds of beam pipes were NEG-coated for the long straight section of the LHC, including the experimental vacuum chambers inserted in the four gigantic detectors. Beyond CERN, NEG coatings have also been employed widely in other large scientific instruments, including the latest generation of synchrotron light sources.

Powerful surface analysis tools

In-situ capabilities

Of course, NEG coatings require thermal activation, so cannot be applied in vacuum systems that are unheatable (i.e. vacuum vessels that operate at cryogenic temperatures or legacy accelerators that may need retrofitting). For these specific cases, the CERN vacuum team has, over the past 15 years, been developing and iterating low-δmax carbon coatings composed mostly of amorphous carbon (a-C) with prevailing graphitic-like bonding among the carbon atoms.

Even though a-C thin films were originally studied for CERN’s older Super Proton Synchrotron (SPS), they are now the baseline solution for the beam screen of the superconducting magnets in the long straight section of the High-Luminosity LHC. A 100 nm thin coating is deposited either in the laboratory for the new magnets (located on both sides of the ATLAS and CMS detectors) or in situ for the ones already installed in the tunnel (both sides of LHCb and ALICE). 

The in situ processing has opened up another productive line of enquiry: the possibility of treating the surface of beam screens (15 m long, a few cm diameter) directly in the accelerators with the help of mobile targets. The expectation is that these innovative coating methods for a-C could, over time, also be applied to improve the performance of installed vacuum chambers in the LHC’s arcs, without the need to dismount magnets and cryogenic connections. 

Opportunity knocks

Looking ahead, the CERN vacuum team has plenty of ideas regarding further diversification of plasma surface treatments – though the direction of travel will ultimately depend on the needs of future studies, projects and collaborations. Near term, for example, there are possible synergies with the Advanced Proton Driven Plasma Wakefield Acceleration Experiment (AWAKE), an accelerator R&D project based at CERN that’s investigating the use of plasma wakefields driven by a proton bunch to accelerate charged particles over short distances. Certainly, the production of denser plasmas (and their manipulation) will be key for future applications in surface treatments for accelerators. 

Another area of interest is the use of plasma-assisted chemical vapour deposition to extend the family of materials that can be deposited. For the longer term, the coating of vacuum systems with inert materials that allow the attainment of very low pressures (in the ultrahigh vacuum regime) in a short timeframe (five years) without bakeout remains one of the most ambitious targets.

The post Plasmas on target in vacuum science appeared first on CERN Courier.

]]>
Feature CERN vacuum experts explore the critical role that surface modification plays in large-scale vacuum systems. https://cerncourier.com/wp-content/uploads/2022/01/CCSupp_Vac_2022_SurfaceScience-frontis-feature.jpg
Detail and diligence ensure LS2 progress https://cerncourier.com/a/detail-and-diligence-ensure-ls2-progress/ Mon, 10 Jan 2022 14:17:08 +0000 https://preview-courier.web.cern.ch/?p=96855 CERN’s vacuum group has completed an intense period of activity during LS2 to prepare the accelerator complex for more luminous beam operations.

The post Detail and diligence ensure LS2 progress appeared first on CERN Courier.

]]>
The LHCb detector

The reliability of the CERN vacuum systems is very much front-and-centre as the restart of the LHC physics programme approaches in mid-2022. The near-term priority is the recommencement of beam circulation in vacuum systems that were open to the air for planned interventions and modification – sometimes for several days or weeks – during Long Shutdown 2 (LS2), a wide-ranging overhaul of CERN’s experimental infrastructure that’s been underway since the beginning of 2019.  

With LS2 now drawing to a close and a pilot beam already circulated in October for a general check of the accelerator chain, it’s worth revisiting the three operational objectives that CERN’s engineering teams set out to achieve during the shutdown: consolidation of the LHC dipole diodes (essential safety elements for the superconducting magnets); bringing forward several interventions required for the High-Luminosity LHC (HL-LHC) project (the successor to the LHC, which will enter operation in 2028); and the LHC Injectors Upgrade project to enhance the injection chain so that beams compatible with HL-LHC expectations can be injected into CERN’s largest machine.

Paolo Chiggiato

“The CERN vacuum team has made fundamental contributions to the achievement of the LS2 core objectives and other parallel activities,” notes Paolo Chiggiato, head of the CERN vacuum, surfaces and coatings group. “As such, we have just completed an intense period of work in the accelerator tunnels and our laboratories, as well as running and supporting numerous technical workshops.” 

As for vacuum specifics, all of the LHC’s arcs were vented to the air after warm-up to room temperature; all welds were leak-checked after the diode consolidation (with only one leak found among the 1796 tests performed); while the vacuum team also replaced or consolidated around 150 turbomolecular pumps acting on the cryogenic insulation vacuum. In total, 2.4 km of non-evaporable-getter (NEG)-coated beampipes were also opened to the air at room temperature – an exhaustive programme of work spanning mechanical repair and upgrade (across 120 weeks), bakeout (spread across 90 weeks) and NEG activation (over 45 weeks). “The vacuum level in these beampipes is now in the required range, with most of the pressure readings below 10⁻¹⁰ mbar,” explains Chiggiato.

Close control

Another LS2 priority for Chiggiato and colleagues involved upgrades to CERN’s vacuum control infrastructure, with the emphasis on reducing single points of failure and the removal of confusing architectures (i.e. systems with no clear separation of function amongst the different programmable logic controllers). “For the first time,” adds Chiggiato, “mobile vacuum equipment was controlled and monitored by wireless technologies – a promising communication choice for distributed systems and areas of the accelerator complex requiring limited stay.” 

Elsewhere, in view of the higher LHC luminosity (and consequent increased radioactivity) following LS2 works, the vacuum group developed and installed advanced radiation-tolerant electronics to control 100 vacuum gauges and valves in the LHC dispersion suppressors. This roll-out represents the first step of a longer-term campaign that will be scaled during the next Long Shutdown (LS3 is scheduled for 2025–2027), including the production of 1000 similar electronic cards for vacuum monitoring. “In parallel,” says Chiggiato, “we have renewed the vacuum control software – introducing resilient, easily scalable and self-healing web services technologies and frameworks used by some of the biggest names in industry.” 

Success breeds success

In the LHC experimental area, meanwhile, the disassembly of the vacuum chambers at the beginning of LS2 required 93 interventions and 550 person-hours of work in the equipment caverns. Reinstallation has progressed well in the four core LHC experiments, with the most impressive refit of vacuum hardware in the CMS and LHCb detectors. 

For the former, the vacuum team installed a new central beryllium chamber (internal diameter 43.4 mm, 7.3 m long), while 12 new aluminium chambers were manufactured, surface-finished and NEG-coated at CERN. Their production comprised eight separate quality checks, from surface treatment to performance assessment of the NEG coating. “The mechanical installation – including alignments, pump-down and leak detection – lasted two months,” explains Chiggiato, “while the bake-out equipment installation, bake-out process, post-bake-out tests and venting with ultrapure neon required another month.” 

In LHCb, the team contributed to the new version of the Vertex Locator (VELO) sub-detector. The VELO’s job is to pick out B mesons from the multitude of other particles produced – a tricky task as their short lives will be spent close to the beam. To find them, the VELO’s RF box – a delicate piece of equipment filled with silicon detectors, electronics and cooling circuits – must be positioned perilously close to the point where protons collide. In this way, the sub-detector faces the beam at a distance of just 5 mm, with an aluminium window thinned down to 150 μm by chemical etching prior to the deposition of a NEG coating. 

As the VELO encloses the RF box, and both volumes are under separate vacuum, the pumpdown is a critical operation because pressure differences across the thin window must be lower than 10 mbar to ensure mechanical integrity. “This work is now complete,” says Chiggiato, “and vacuum control of the VELO is in the hands of the CERN vacuum team after a successful handover from specialists at Nikhef [the Dutch National Institute for Subatomic Physics].” 
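
Operationally, that constraint turns the pumpdown into a coordinated two-volume problem: whichever side is pumping faster must be throttled so the window never sees more than the allowed differential. The sketch below is a deliberately simplified toy model of such an interlocked pumpdown; the volumes, pumping speeds and throttling margin are invented for illustration and are not the actual VELO parameters.

import math

# Toy model of an interlocked two-volume pumpdown in which the pressure
# difference across a thin window must stay below a mechanical limit.
# All numbers are illustrative assumptions, not VELO values.
MAX_DELTA_P = 10.0           # mbar, limit across the thin window
V_DET, V_BOX = 1.0, 0.05     # m^3, assumed volumes of the two vessels
S_DET, S_BOX = 0.005, 0.005  # m^3/s, assumed effective pumping speeds

def pump(p, volume, speed, dt):
    """One step of volume-dominated pumpdown: p(t) = p0*exp(-S*t/V)."""
    return p * math.exp(-speed * dt / volume)

p_det = p_box = 1013.0  # both vessels start at atmospheric pressure (mbar)
t, dt = 0.0, 1.0        # seconds

while p_det > 1e-3 or p_box > 1e-3:
    p_det = pump(p_det, V_DET, S_DET, dt)
    # The smaller RF-box volume would pump down much faster; a throttle
    # valve is modelled by clamping it to half the allowed differential.
    p_box = max(pump(p_box, V_BOX, S_BOX, dt), p_det - 0.5 * MAX_DELTA_P)
    assert abs(p_det - p_box) < MAX_DELTA_P  # window integrity preserved
    t += dt

print(f"both volumes below 1e-3 mbar after {t / 60:.0f} minutes")

In this toy the larger vessel sets the overall pace while the clamp holds the differential at half the limit throughout; the real procedure adds gauges, interlocks and far slower, staged pumping.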

Wrapping up a three-year effort, the vacuum team’s last planned activity in LS2 involves the bake-out of the ATLAS and CMS beampipes in early 2022. “There was no shortage of technical blockers and potential show-stoppers during our LS2 work programme,” Chiggiato concludes. “Thankfully, creative problem-solving is part of the vacuum team’s DNA, as is the rigorous application of vacuum best practice and domain knowledge accumulated over decades of activity. Ours is a collective mindset, moreover, driven by a humble approach to such complex technological installations, where every single detail can have important consequences.”

A detailed report on the CERN vacuum team’s LS2 work programme – including the operational and technical challenges along the way – will follow in the March/April 2022 issue of CERN Courier magazine.

Thinking big in vacuum R&D https://cerncourier.com/a/thinking-big-in-vacuum-rd/ Mon, 10 Jan 2022 14:16:55 +0000 https://preview-courier.web.cern.ch/?p=96840 Head of the KIT Vacuum Lab, Christian Day, discusses the importance of an integrated approach to vacuum system development in which modelling, simulation and experimental validation all work in tandem.

TIMO - the lab’s large multipurpose vacuum vessel

Christian Day is a vacuum scientist on a mission – almost evangelically so. As head of the Vacuum Lab at Karlsruhe Institute of Technology (KIT), part of Germany’s renowned network of federally funded Helmholtz Research Centres, Day and his multidisciplinary R&D team are putting their vacuum know-how to work in tackling some of society’s “grand challenges”. 

Thinking big goes with the territory. The KIT Vacuum Lab works across a broad canvas, one that acknowledges the core enabling role of vacuum technology in all manner of big-science endeavours – from the ITER nuclear-fusion research programme to fundamental studies of the origins of the universe (Day is also technical lead for cryogenic vacuum on the design study for the Einstein Telescope, a proposed next-generation gravitational-wave observatory).

Christian Day

Here, Day tells CERN Courier about the Vacuum Lab’s unique R&D capabilities and the importance of an integrated approach to vacuum system development in which modelling, simulation and experimental validation all work in tandem to foster process and technology innovation.

What are the long-term priorities of the KIT Vacuum Lab?

Our aim is to advance vacuum science and technology along three main pathways: an extensive R&D programme in collaboration with a range of university and scientific partners; design and consultancy services for industry; and vacuum education to support undergraduate and postgraduate science and engineering students at KIT. It’s very much a multidisciplinary effort, with a staff team of 20 vacuum specialists working across physics, software development and the core engineering disciplines (chemical, mechanical, electrical). They’re supported, at any given time, by a cohort of typically five PhD students. 

So what does that mean in terms of the team’s core competencies?

At a headline level, we’re focused on the two fundamental challenges in modern vacuum science: the realisation of a physically consistent description of outgassing behaviours for a range of materials and vacuum regimes, and the development of advanced vacuum gas dynamics methods and associated supercomputer algorithms. 

As such, one of the main strengths of the KIT Vacuum Lab is our prioritisation of predictive code development alongside experimental validation – twin capabilities that enable us to take on the design, delivery and project-management of the most complex vacuum systems. The resulting work programme is nothing if not diverse – from very-large-scale vacuum pumping systems for nuclear fusion to contamination-free vacuum applications in advanced manufacturing (e.g. extreme UV lithography and solar-cell fabrication).

What sets the KIT Vacuum Lab apart from other vacuum R&D programmes?

Over the last 10 years or so, and very much driven by our contribution to the ITER nuclear fusion project in southern France, we have developed a unique and powerful software capability to model vacuum regimes at a granular level – from atmospheric pressure all the way down to extreme-high-vacuum (XHV) conditions (10⁻¹⁰ Pa and lower). This capability, and the massive computational resources that make it possible, are being put to use across all manner of advanced vacuum applications – quantum computing, Hyperloop transportation systems and gravitational-wave experiments, among others.

Mistakenly, early-career researchers often think that vacuum is a somehow old-fashioned service that they can buy off-the-shelf

The Vacuum Lab’s organising principles are built around “integrated process development”. What does that look like operationally?

It means we take a holistic view regarding the development of vacuum processes, which allows us to identify the main influences in the vacuum system and to map them theoretically or experimentally. An iterative design evolution must not only be based on efficient models; it must also be validated and parameterised by using experimental data from different levels of the process hierarchy. Experimental data are indispensable to evaluate the pros and cons of competing models and to quantify the uncertainties of model predictions. 

In turn, the department’s research structure is set up to address elementary processes and unit functions within a vacuum system. When choosing a vacuum pump, for example, it’s important to understand how the pump design, underlying technology and connectivity will influence other parts of the vacuum system. It’s also necessary, though too often forgotten, for the end-user to understand the ultimate purpose of the vacuum system – the why – so that they can address any issues arising in terms of the vacuum science fundamentals and underlying physics.

What are your key areas of emphasis in fusion research right now? 

Our vacuum R&D programme in nuclear fusion is carried out under the umbrella of EUROfusion, a consortium agreement signed by 30 research organisations and universities from 25 EU countries plus the UK, Switzerland and Ukraine. Collectively, the participating partners in EUROfusion are gearing up for the ITER experimental programme (due to come online in 2025), with a longer-term focus on the enabling technologies – including the vacuum systems – for a proof-of-principle fusion power plant called DEMO. The latter is still at the concept phase, though provisionally scheduled for completion by 2050. 

As EUROfusion project leader for the tritium fuel cycle, I’m overseeing KIT’s vacuum R&D inputs to the DEMO fusion reactor – a collective effort that we’ve labelled the Vacuum Pumping Task Force and involving multiple research/industry partners. The vacuum systems in today’s nuclear fusion reactors – including the work-in-progress ITER facility – rely largely on customised cryosorption pumps for vacuum pumping of the main reaction vessel and the neutral beam injector (essentially by trapping gases and vapours on an ultracold surface). DEMO, though, will require a portfolio of advanced pumping concepts to be developed for ongoing operations, including metal-foil and mercury-based diffusion and ring pumps as well as high-capacity non-evaporable-getter (NEG) materials (see “Next-generation pump designs: from ITER to the DEMO fusion project”). 

Next-generation pump designs: from ITER to the DEMO fusion project

By Yannick Kathage

When the ITER experimental reactor enters operation later this decade, nuclear fusion will be realised in a tokamak device that uses superconducting magnets to contain a hot plasma in the shape of a torus. Herein the fusion reaction between deuterium and tritium (DT) nuclei will produce one helium nucleus and one neutron, liberating huge amounts of energy that will heat the walls of the reactor and be exploited in a power cycle for electricity production.

In this way, fusion reactors like ITER must combine large-scale vacuum innovation with highly customised pumping systems. The ITER plasma chamber (1400 m³), for example, will be pumped at fine vacuum (several Pa) against large gas throughputs in the course of the plasma discharge – the so-called burn phase for energy generation. There follows a dwell phase, when the chamber will be pumped down (for approximately 23 minutes) to moderately high vacuum (5 × 10⁻⁴ Pa), before initiating the next plasma discharge. Meanwhile, surrounding the plasma chamber is an 8500 m³ cryostat to provide a 10 mPa cryogenic insulation vacuum (required for the operation of the superconducting magnets). 
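
A back-of-the-envelope cross-check of those dwell-phase numbers is instructive. Assuming a simple volume-dominated pumpdown, neglecting outgassing, and reading “several Pa” as roughly 2 Pa, the implied effective pumping speed at the chamber is

\[
p(t) = p_0\,e^{-S_{\mathrm{eff}}\,t/V}
\quad\Rightarrow\quad
S_{\mathrm{eff}} = \frac{V}{t}\,\ln\frac{p_0}{p_1}
\approx \frac{1400~\mathrm{m}^3}{1380~\mathrm{s}}\,\ln\frac{2}{5\times10^{-4}}
\approx 8~\mathrm{m}^3/\mathrm{s}.
\]

In practice, outgassing and conductance losses dominate towards the end of the pumpdown, so this figure is best read as a lower bound on the installed pumping speed.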

A key design requirement for all of ITER’s pumping systems is to ensure compatibility with tritium, a radioactive hydrogen isotope. Effectively, this rules out the use of elastomer seals (only metal joints are permitted) and the use of oil or lubricants (which are destroyed by tritium contamination). Specifically, the six torus exhaust systems are based on so-called discontinuous cryosorption pumps, cooled with supercritical helium gas at 5 K and coated with activated charcoal as sorbent material to capture helium (primarily), a mix of hydrogen isotopes and various impurities from the plasma chamber. 

As with all accumulation pumps, these cryopumps must be regenerated by heating on a regular basis. To provide a constant pumping speed on the torus during the plasma pulse, it’s therefore necessary to “build in” additional cryopumping capacity – such that four systems are always pumping, while the other two are in regeneration mode. What’s more, these six primary cryopumps are backed by a series of secondary cryopumps and, finally, by dry mechanical rough pumps that compress to ambient pressure. 
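
The staggered rotation implied here is easily visualised. The sketch below is purely illustrative: the pump names, the three-cycle horizon and the fixed rotation are invented, and real regeneration scheduling depends on sorbent loading rather than a rigid timetable.

from collections import deque

# Illustrative staggered-regeneration rotation for six torus cryopumps:
# four pump while two regenerate, so the pumping speed seen by the
# torus stays constant across plasma pulses.
pumps = deque(f"CP{i}" for i in range(1, 7))

for cycle in range(3):
    pumping = [pumps[i] for i in range(4)]
    regenerating = [pumps[i] for i in range(4, 6)]
    print(f"cycle {cycle}: pumping {pumping}, regenerating {regenerating}")
    pumps.rotate(2)  # the two freshly regenerated pumps rejoin the pool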

Scaling up for DEMO

The operational principle of the cryosorption pump means that large quantities of tritium accumulate within the sorbent material over time – a safety concern that’s very much on the radar of the ITER management team as well as Europe’s nuclear regulatory agencies. Furthermore, this “tritium inventory” will only be amplified in the planned future DEMO power plant, providing real impetus for the development of new, and fully continuous, pumping technologies tailored for advanced fusion applications. 

Testing and reconfiguration at the KIT lab

Among the most promising candidates in this regard is the so-called metal-foil pump (MFP), which uses a plasma source to permeate, and ultimately compress, a flux of pure hydrogen isotopes through a group V metal foil (e.g. niobium or vanadium) using an effect called superpermeation. The driving force here is an energy gradient in the gas species, up and downstream of the foil (due to plasma excitation, but largely independent of pressure). It’s worth noting that the KIT Vacuum Pumping Task Force initiated development work on the MFP concept five years ago, with a phased development approach targeting “mature technical exploitation” of superpermeation pumping systems by 2027.  

If ultimately deployed in a reactor context, the MFP will yield two downstream components: a permeated gas stream (comprising D and T, which will be cycled directly back into the fusion reactor) and an unprocessed stream (comprising D, T, He and impurities), which undergoes extensive post-processing to yield an additional D, T feedstock. It is envisaged that both gas streams, in turn, will be pumped by a train of continuously working rough pumps that use mercury as a working fluid (owing to the metal’s compatibility with tritium). As the DEMO plant will feature a multibarrier concept for the confinement of tritium, mercury can also be circulated safely in a closed-loop system. 

One such alternative roughing-pump technology is also being developed by the KIT Vacuum Pumping Task Force – specifically, a full stainless-steel mercury ring pump that compresses to ambient pressure. Progress has been swift since the first pump-down curve with this set-up was measured (in 2013), and the task force now has a third-generation design working smoothly in the lab, albeit with all rotary equipment redesigned to take account of the fact that mercury has a specific weight 13 times greater than that of water (the usual operating fluid in a ring pump).

Hydrogen impacts

While mercury-based vacuum pumping is seen as a game-changer exclusively for fusion applications, it’s evident that the MFP is attracting wider commercial attention. That’s chiefly because superpermeation works only for the hydrogenic species in the gas mixture being pumped – thereby suggesting the basis of a scalable separation functionality. In the emerging hydrogen economy, for example, it’s possible that MFP technology, if suitably dimensioned, could be implemented to continuously separate hydrogen from the synthesis gas of classical gasification reactions (steam reforming) and at purities that can otherwise only be achieved via electrolytic processes (which require huge energy consumption). 

Put simply: MFP technology has the potential to significantly reduce the ecological footprint associated with the mass-production of pure hydrogen. As such, once the MFP R&D programme achieves sufficient technical readiness, the KIT Vacuum Lab will be seeking to partner with industry to commercialise the associated know-how and expertise. 

Yannick Kathage is a chemical engineering research student in the KIT Vacuum Lab.

How does all that translate into the unique facilities and capabilities within the KIT Vacuum Lab? 

The KIT Vacuum Lab is developing high-capacity NEG materials for applications in nuclear fusion research

Our work on fusion vacuum pumps requires specialist domain knowledge regarding, for example, the safe handling of mercury as well as how to manage, measure and mitigate the associated radioactivity hazard associated with tritium-compatible vacuum systems. We have set up a dedicated mercury lab, for example, to investigate the fluid dynamics of mercury diffusion pumps as well as a test station to optimise their performance at a system level. 

Many of the other laboratory facilities are non-standard and not found anywhere else. Our Outgassing Measurement Apparatus (OMA), for example, uses the so-called difference method for high-resolution measurements of very low levels of outgassing across a range of temperatures (from ambient to 570 K). The advantage of the difference method is that a second vacuum chamber, which is identical to the sample chamber, is used as a reference in order to directly subtract the background outgassing rate of the chamber. 
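
In symbols (with notation chosen here purely for illustration), the difference method determines the specific outgassing rate of a sample as

\[
q_{\mathrm{sample}} = \frac{Q_{\mathrm{meas}} - Q_{\mathrm{ref}}}{A_{\mathrm{sample}}},
\]

where Q_meas and Q_ref are the total outgassing throughputs (e.g. in mbar·l/s) measured from the sample chamber and from the identical empty reference chamber, and A_sample is the sample’s surface area. Because both chambers share the same geometry, materials and thermal history, their common background cancels to first order.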

Meanwhile, our TransFlow facility allows us to generate fluid flows at different levels of rarefaction, and across a range of channel geometries, to validate our in-house code development. Even TIMO, our large multipurpose vacuum vessel – a workhorse piece of kit in any vacuum R&D lab – is heavily customised, offering temperature cycling from 450 K down to 4 K. 

What about future plans for the KIT Vacuum Lab?

A significant expansion of the lab is planned over the next four years, with construction of a new experimental hall to house a 1:1 scale version of the vacuum section of the DEMO fuel cycle. This facility – the catchily titled Direct Internal Recycling Integrated Development Platform Karlsruhe, or DIPAK – will support development and iteration of key DEMO vacuum systems and associated infrastructure, including a large vacuum vessel to replicate the torus – a non-trivial engineering challenge at 30 tonnes, 7 m long and 3.5 m in diameter. 

How do you attract the brightest and best scientists and engineers to the KIT Vacuum Lab?

The specialist teaching and lecture programme that the vacuum team provides across the KIT campus feeds our talent pipeline and helps us attract talented postgraduates. Early-career researchers often think – mistakenly – that vacuum is somehow old-fashioned and a “commoditised service” that they can buy off-the-shelf. Our educational outreach shows them otherwise, highlighting no shortage of exciting R&D challenges to be addressed in vacuum science and technology – whether that’s an exotic new pumping system for nuclear fusion or a low-outgassing coating for an accelerator beamline. 

The multidisciplinary nature of the vacuum R&D programme certainly helps to broaden our appeal, as does our list of high-profile research partners spanning fundamental science (e.g. the Weizmann Institute of Science in Rehovot, Israel), the particle accelerator community (e.g. TRIUMF in Canada) and industry (e.g. Leybold and Zeiss in Germany). Wherever they are, we’re always keen to talk to talented candidates interested in working with us.

Vacuum solutions fuel fusion dreams https://cerncourier.com/a/vacuum-solutions-fuel-fusion-dreams/ Mon, 10 Jan 2022 14:16:41 +0000 https://preview-courier.web.cern.ch/?p=96823 ITER vacuum-section leader Robert Pearce describes the latest progress with one of the largest and most complex vacuum systems ever built.

ITER vacuum vessel being moved into the tokamak pit

Robert Pearce is all about the detail. That’s probably as it should be for the section leader of the diverse, sprawling vacuum ecosystem now taking shape as part of the work-in-progress ITER experimental reactor in southern France. When it comes online in the mid-2020s, this collaborative megaproject – which is backed by China, the European Union, India, Japan, Korea, Russia and the US – will generate nuclear fusion in a tokamak device (the world’s largest) that uses superconducting magnets to contain and control a hot plasma in the shape of a torus. In the process, ITER will also become the first experimental fusion machine to achieve “net energy” – when the total power produced during a fusion plasma pulse surpasses the power injected to heat the plasma – while providing researchers with a real-world platform to test the integrated technologies, materials and physics regimes necessary for future commercial production of fusion-based electricity. 

Robert Pearce giving a lecture to students

Vacuum reimagined

If ITER is big science writ large, then its myriad vacuum systems are an equally bold reimagining – at scale – of vacuum science, technology and innovation. “ITER requires one of the most complex vacuum systems ever built,” explains Pearce. “We’ve overcome a lot of challenges so far in the construction of the vacuum infrastructure, though there are doubtless more along the way. One thing is certain: we will need to achieve a lot of vacuum – across a range of regimes and with enabling technologies that deliver bulletproof integrity – to ensure successful, sustained fusion operation.” 

The task of turning the vacuum vision into reality falls to Pearce and a core team of around 30 engineers and physicists based at the main ITER campus at Cadarache. It’s a multidisciplinary effort, with domain knowledge and expertise spanning mechanical engineering, modelling and simulation, experimental validation, surface science, systems deployment and integration, as well as process control and instrumentation. At a headline level, the group is focused on delivery versus two guiding objectives. “We need to make sure all the vacuum systems are specified to our exacting standards in terms of leak tightness, cleanliness and optimal systems integration so that everything works together seamlessly,” notes Pearce. “The other aspect of our remit involves working with multiple partner organisations to develop, validate and implement the main pumping systems, vacuum chambers and distribution network.” 

The tokamak at the heart of the ITER construction site

Sharing the load

Beyond the main project campus, the two primary partners on the ITER vacuum programme are the Fusion for Energy (F4E) team in Barcelona, Spain, and US ITER in Oak Ridge, Tennessee, both of which support the vacuum effort through “in-kind” contributions of equipment and personnel to complement direct cash investments from the member countries. While the ITER Vacuum Handbook – effectively the project bible for all things vacuum – provides a reference point to shape best practice across vacuum hardware, associated control systems, instrumentation and quality management, there’s no one-size-fits-all model for the relationship between the Cadarache vacuum team and its partner network.

“We supply ‘build-to-print’ designs to Barcelona – for example, in the case of the large torus cryopump systems – and they, in collaboration with us, then take care of the procurement with their chosen industry suppliers,” explains Pearce. With Oak Ridge, which is responsible for provision of the vacuum auxiliary and roughing pumps systems (among other things), the collaboration is based on what Pearce calls “functional specification procurement…in which we articulate more of the functionality and they then work through a preliminary and final design with us”. 

Vacuum innovation: ITER’s impact dividend

While ITER’s vacuum team pushes the boundaries of what’s possible in applied vacuum science, industry partners are working alongside to deliver the enabling technology innovations, spanning one-of-a-kind pumping installations to advanced instrumentation and ancillary equipment. 

The ITER neutral beam injector systems – accelerators that will drive high-energy neutral particles into the tokamak to heat the fusion plasma – are a case in point. The two main injectors (each roughly the size of a locomotive) will be pumped by a pair of open-structure, panel-style cryosorption pumps (with a single pump measuring 8 m long and 2.8 m high). 

Working in tandem, the pumps will achieve a pumping speed of 4500 m³/s for hydrogen, with a robust stainless-steel boundary necessary for the cryogenic circuits to provide a confinement barrier between tritium (which is radioactive) and cryogenic helium. 

Key to success is a co-development effort – involving ITER engineers and industry partner Ravanat (France) – to realise a new manufacturing method for the fabrication of cryopanels via expansion of stainless-steel tube (at around 2000 bar) into aluminium extrusions. It’s a breakthrough, moreover, that delivers excellent thermal contact over the operating temperature range (4.5 K for pumping to 400 K for regeneration), while combining the robustness of stainless steel with the thermal conductivity of aluminium.

Industry innovation is also in evidence at a smaller scale. As the ITER project progresses to the active (nuclear) phase of operations, for example, human access to the cryostat will be very limited. With this in mind, the In-Pipe Inspection Tool (IPIT) is being developed for remote inspection and leak localisation within the tens of km of cryostat pipework. 

An R&D collaboration between ITER vacuum engineers, F4E and Italian robotics firm Danieli Telerobot, the IPIT is capable of deployment in small-bore pipework up to 45 m from the insertion point. The unit combines a high-resolution camera for inspection of welds internal to the pipe, as well as a dedicated “bladder” for isolation of vacuum leaks prior to repair. 

Other instrument innovations already well developed by industry to meet ITER’s needs include a radiation-hardened (> 1 MGy) and magnetic-field-compatible (> 200 mT) residual gas analyser that permits remote operation via a controller up to 140 m away (supplied by Hiden Analytical, UK); and also an optical diaphragm gauge (> 1 MGy, > 200 mT) with a measurement capability in line with the best capacitive manometers (a co-development between Inficon, Germany, and OpSens Solutions, Canada).

When it comes to downstream commercial opportunities, it’s notable that ITER member countries share the experimental results and any intellectual property generated by ITER during the development, construction and operation phases of the project.

More broadly, because vacuum is central to so many of ITER’s core systems – including the main tokamak vessel (1400 m³), the surrounding cryostat (16,000 m³) and the superconducting magnets – the vacuum team also has touch-points and dependencies with an extended network of research partners and equipment makers across ITER’s member countries. Unsurprisingly, with more than 300 pumping systems and 10 different pumping technologies to be deployed across the ITER plant, complexity is one of the biggest engineering challenges confronting Pearce and his team. 

“Once operational, ITER will have thousands of different volumes that need pumping across a range of vacuum regimes,” notes Pearce. “Overall, there’s high diversity in terms of vacuum function and need, though the ITER Vacuum Handbook does help to standardise our approach to issues like leak tightness, weld quality, testing protocols, cleanliness and the like.”

The ITER cryoplant

Atypical vacuum

Notwithstanding the complexity of ITER’s large-scale vacuum infrastructure, Pearce and his team must also contend with the atypical operational constraints in and around the fusion tokamak. For starters, many of the machine’s vacuum components (and associated instrumentation) need to be qualified for operation in a nuclear environment (the ITER tokamak and supporting plant must enclose and securely contain radioactive species like tritium) and to cope with strong magnetic fields (up to 7 T in the plasma chamber and up to 300 mT for the vacuum valves and instruments). In terms of qualification, it’s notable that ITER is being built in a region with a history of seismic activity – deliberately so, to demonstrate that a fusion reactor can be operated safely anywhere in the world. 

“Ultimately,” concludes Pearce, “any vacuum system – and especially one on the scale and complexity required for ITER – requires great attention to detail to be successful.”

MAX IV: partnership is the key https://cerncourier.com/a/max-iv-partnership-is-the-key/ Mon, 10 Jan 2022 14:16:27 +0000 https://preview-courier.web.cern.ch/?p=96863 State-of-the-art ultrahigh-vacuum technologies underpin the 3 GeV electron storage ring of Sweden's MAX IV light source.

Sweden’s MAX IV synchrotron radiation facility

Sweden’s MAX IV synchrotron radiation facility is among an elite cadre of advanced X-ray sources, shedding light on the structure and behaviour of matter at the atomic and molecular level across a range of fundamental and applied disciplines – from clean-energy technologies to pharma and healthcare, from structural biology and nanotech to food science and cultural heritage. 

Marek Grabski, MAX IV’s vacuum section leader

In terms of core building blocks, this fourth-generation light source – which was inaugurated in 2016 – consists of a linear electron accelerator plus 1.5 and 3 GeV electron storage rings (with the two rings optimised for the production of soft and hard X rays, respectively). As well as delivering beam to a short-pulse facility, the linac serves as a full-energy injector to the two storage rings which, in turn, provide photons that are extracted for user experiments across 14 specialist beamlines.

Underpinning all of this is a ground-breaking implementation of ultrahigh-vacuum (UHV) technologies within MAX IV’s 3 GeV electron storage ring – the first synchrotron storage ring in which the inner surface of almost all the vacuum chambers along its circumference are coated with non-evaporable-getter (NEG) thin film for distributed pumping and low dynamic outgassing. Here, Marek Grabski, MAX IV vacuum section leader, gives CERN Courier the insider take on a unique vacuum installation and its subsequent operational validation. 

What are the main design challenges associated with the 3 GeV storage-ring vacuum system?

We were up against a number of technical constraints that necessitated an innovative approach to vacuum design. The vacuum chambers, for example, are encapsulated within the storage ring’s compact magnet blocks with bore apertures of 25 mm diameter (see “The MAX IV 3 GeV storage ring: unique technologies, unprecedented performance”). What’s more, there are requirements for long beam lifetime, space limitations imposed by the magnet design, the need for heat dissipation from incoming synchrotron radiation, as well as minimal beam-coupling impedance. 

The answer, it turned out, is a baseline design concept that exploits NEG thin-film coatings, a technology originally pioneered by CERN that combines distributed pumping of active residual gas species with low photon-stimulated desorption. The NEG coating was applied by magnetron sputtering to almost all the inner surfaces (98% lengthwise) of the vacuum chambers along the electron beam path. As a consequence, there are only three lumped ion pumps fitted on each standard “achromat” (20 achromats in all, with a single achromat measuring 26.4 m end-to-end). That’s far fewer than typically seen in other advanced synchrotron light sources. 

The MAX IV 3 GeV storage ring: unique technologies, unprecedented performance

Among the must-have user requirements for the 3 GeV storage ring was the specified design goal of reaching ultralow electron-beam emittance (and ultrahigh brightness) within a relatively small circumference (528 m). As such, the bare lattice natural emittance for the 3 GeV ring is 328 pm rad – more than an order of magnitude lower than typically achieved by previous third-generation storage rings in the same energy range.

Even though the fundamental concepts for realising ultralow emittance had been laid out in the early 1990s, many in the synchrotron community remained sceptical that the innovative technical solutions proposed for MAX IV would work. Despite the naysayers, on 25 August 2015 the first electron beam circulated in the 3 GeV storage ring and, over time, all design parameters were realised: the fourth generation of storage-ring-based light sources was born. 

Layout of the MAX IV lab and aerial view of the main facilities

Stringent beam parameters 

The MAX IV 3 GeV storage ring represents the first deployment of a so-called multibend achromat magnet lattice in an accelerator of this type, with the large number of bending magnets central to ensuring ultralow horizontal beam emittance. In all, there are seven bending magnets per achromat (and 20 achromats making up the complete storage ring). 

Not surprisingly, miniaturisation is a priority in order to accommodate the 140 magnet blocks – each consisting of a dipole magnet and other magnet types (quadrupoles, sextupoles, octupoles and correctors) – into the ring circumference. This was achieved by CNC machining the bending magnets from a single piece of solid steel (with high tolerances) and combining them with other magnet types into a single integrated block. All magnets within one block are mechanically referenced, with only the block as a whole aligned on a concrete girder.

Vacuum innovation

Meanwhile, the vacuum system design for the 3 GeV storage ring also required plenty of innovative thinking, key to which was the close collaboration between MAX IV and the vacuum team at the ALBA Synchrotron in Barcelona. For starters, the storage-ring vacuum vessels are made from extruded, oxygen-free, silver-bearing copper tubes (22 mm inner diameter, 1 mm wall thickness). 

Copper’s superior electrical and thermal conductivities are crucial when it comes to heat dissipation and electron beam impedance. The majority of the chamber walls act as heat absorbers, directly intercepting synchrotron radiation coming from the bending magnets. The resulting heat is dissipated by cooling water flowing in channels welded on the outer side of the vacuum chambers. Copper also absorbs unwanted radiation better than aluminium, offering enhanced protection for key hardware and instrumentation in the tunnel. 

The use of crotch absorbers for extraction of the photon beam is limited to one unit per achromat, while the section where synchrotron radiation is extracted to the beamlines is the only place where the vacuum vessels incorporate an antechamber. Herein the system design is particularly challenging, with the need for additional cooling blocks to be introduced on the vacuum chambers with the highest heat loads. 

Other important components of the vacuum system are the beam position monitors (BPMs), which are needed to keep the synchrotron beam on an optimised orbit. There are 10 BPMs in each of the 20 achromats, all of them decoupled thermally and mechanically from the vacuum chambers through RF-shielded bellows that also allow longitudinal expansion and small transversal movement of the chambers.

Ultimately, the space constraints imposed by the closed magnet block design – as well as the aggregate number of blocks along the ring circumference – were a big factor in the decision to implement a NEG-based pumping solution for MAX IV’s 3 GeV storage ring. It’s simply not possible to incorporate sufficient lumped ion pumps to keep the pressure inside the accelerator at the required level (below 1 × 10⁻⁹ mbar) to achieve the desired beam lifetime while minimising residual gas–beam interactions. 
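
The underlying constraint is the molecular-flow conductance of a narrow pipe, which scales with the cube of its diameter. Using the standard long-tube estimate for air at room temperature (d and L in cm),

\[
C \approx 12.1\,\frac{d^{3}}{L}~\mathrm{l/s}
\quad\Rightarrow\quad
C \approx 12.1\times\frac{(2.2)^{3}}{100} \approx 1.3~\mathrm{l/s}
\]

for one metre of the 22 mm-bore chamber. Even an arbitrarily large ion pump sitting a metre away can therefore deliver an effective pumping speed of only about 1 l/s at that point (a few l/s for hydrogen, which has a higher thermal velocity), whereas the NEG film pumps along the entire wall.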

Operationally, it’s worth noting that a purified neon venting scheme (originally developed at CERN) has emerged as the best-practice solution for vacuum interventions and replacement or upgrade of vacuum chambers and components. As evidenced on two occasions so far (in 2018 and 2020), the benefits include significantly reduced downtime and risk management when splitting magnets and reactivating the NEG coating. 

How important was collaboration with CERN’s vacuum group on the NEG coatings?

Put simply, the large-scale deployment of NEG coatings as the core vacuum technology for the 3 GeV storage ring would not have been possible without the collaboration and support of CERN’s vacuum, surfaces and coatings (VSC) group. Working together, our main objective was to ensure that all the substrates used for chamber manufacturing, as well as the compact geometry of the 3 GeV storage-ring vacuum vessels, were compatible with the NEG coating process (in terms of coating adhesion, thickness, composition and activation behaviour). Key to success was the deep domain knowledge and proactive technical support of the VSC group, as well as access to CERN’s specialist facilities, including the mechanical workshop, vacuum laboratory and surface treatment plant. 

What did the manufacturing model look like for this vacuum system? 

Because of the technology and knowledge transfer from CERN to industry, it was possible for the majority of the vacuum chambers to be manufactured, cleaned, NEG-coated and tested by a single commercial supplier – in this case, FMB Feinwerk- und Messtechnik in Berlin, Germany. Lengthwise, 70% of the chambers were NEG-coated by the same vendor. Naturally, the manufacturing of all chambers had to be compatible with the NEG coating, which meant careful selection and verification of materials, joining methods (brazing) and handling. Equally important, the raw materials needed to undergo surface treatment compatible with the coating, with the final surface cleaning certified by CERN to ensure good film adhesion under all operating conditions – a potential bottleneck that was navigated thanks to excellent collaboration between the three parties involved. 

To spread the load, and to relieve the pressure on our commercial supplier ahead of system installation (which commenced in late 2014), it’s worth noting that the most geometrically complicated chambers (including vacuum vessels with a 5 mm vertical aperture antechamber) were NEG-coated at CERN. Further NEG coating support was provided through a parallel collaboration with the European Synchrotron Radiation Facility (ESRF) in Grenoble. 

How did you handle the installation phase? 

This was a busy – and at times stressful – phase of the project, not least because all the vacuum chambers were being delivered “just-in-time” for final assembly in situ. This approach was possible thanks to exhaustive testing and qualification of all vacuum components prior to shipping from the commercial vendor, while extensive dialogue with the MAX IV team helped to resolve any issues arising before the vacuum components left the factory. 

Owing to the tight schedule for installation – just eight months – we initiated a collaboration with the Budker Institute of Nuclear Physics (BINP) in Russia to provide additional support. For the duration of the installation phase, we had two teams of specialists from BINP working alongside (and coordinated by) the MAX IV vacuum team. All vacuum-related processes – including assembly, testing, baking and NEG activation of each achromat (at 180 °C) – took place inside the accelerator tunnel directly above the opened lower magnet blocks of MAX IV’s multibend achromat (MBA) lattice. Our installation approach, though unconventional, yielded many advantages – not least, a reduction in the risks related to transportation of assembled vacuum sectors as well as reduced alignment issues. 

Presumably not everything went to plan through installation and acceptance?

One of the issues we encountered during the initial installation phase was a localised peeling of the NEG coating on the RF-shielded bellows assembly of several vacuum vessels. This was addressed as a matter of priority – NEG film fragments falling into the beam path is a show-stopper – and all the affected modules were replaced by the vendor in double-quick time. More broadly, the experience of the BINP staff meant difficulties with the geometry of a few chambers could also be resolved on the spot, while the just-in-time delivery of all the main vacuum components worked well, such that the installation was completed successfully and on time. After completion of several achromats, we installed straight sections in between while the RF cavities were integrated and conditioned in situ. 

Magnet block, complete achromat and the vacuum installation team

How has the vacuum system performed from the commissioning phase and into regular operation? 

Bear in mind that MAX IV was the first synchrotron light source to apply NEG technology on such a scale. We were breaking new ground at the time, so there were credible concerns regarding the conditioning and long-term reliability of the NEG vacuum system – and, of course, possible effects on machine operation and performance. From commissioning into regular operations, however, it’s clear that the NEG pumping system is reliable, robust and efficient in delivering low dynamic pressure in the UHV regime.

Initial concerns around potential saturation of the NEG coating in the early stages of commissioning (when pressures are high) proved to be unfounded, while the same is true for the risk associated with peeling of the coating (and potential impacts on beam lifetime). We did address a few issues with hot-spots on the vacuum chambers during system conditioning, though again the overall impacts on machine performance were minimal. 

To sum up: the design current of 500 mA was successfully injected and stored in November 2018, proving that the vacuum system can handle the intense synchrotron radiation. After more than six years of operation, and 5000 Ah of accumulated beam dose, it is clear the vacuum system is reliable and provides sustained UHV conditions for the circulating beam – a performance, moreover, that matches or even exceeds that of conventional vacuum systems used in other storage rings.

What are the main lessons your team learned along the way through design, installation, commissioning and operation of the 3 GeV storage-ring vacuum system?

The unique parameters of the 3 GeV storage ring were delivered according to specification and per our anticipated timeline at the end of 2015. Successful project delivery was only possible by building on the collective experience and know-how of staff at MAX-lab (MAX IV’s predecessor) constructing and operating accelerators since the 1970s – and especially the lab’s “explorer mindset” for the early-adoption of new ideas and enabling technologies. Equally important, the commitment and team spirit of our technical staff, reinforced by our collaborations with colleagues at ALBA, CERN, ESRF and BINP, were fundamental to the realisation of a relatively simple, efficient and compact vacuum solution.

Operationally, it’s worth adding that there are many dependencies between the chosen enabling technologies in a project as complex as the MAX IV 3 GeV storage ring. As such, it was essential for us to take a holistic view of the vacuum system from the start, with the choice of a NEG pumping solution enforcing constraints across many aspects of the design – for example, chamber geometry, substrate type, surface treatment and the need for bellows. The earlier such knowledge is gathered within the laboratory, the more it pays off during construction and operation. Suffice to say, the design and technology solutions employed by MAX IV have opened the door for other advanced light sources to navigate and build on our experience.

Linacs to narrow radiotherapy gap https://cerncourier.com/a/linacs-to-narrow-radiotherapy-gap/ Tue, 21 Dec 2021 13:37:26 +0000 https://preview-courier.web.cern.ch/?p=96625 Technology originally developed for high-energy physics is being used to develop a novel medical linear accelerator for radiotherapy in low- and middle-income countries.

Number of people in African countries who have access to radiotherapy facilities

By 2040, the annual global incidence of cancer is expected to rise by more than 42% from 19.3 million to 27.5 million cases, corresponding to approximately 16.3 million deaths. Shockingly, some 70% of these new cases will be in low- and middle-income countries (LMICs), which lack the healthcare programmes required to effectively manage their cancer burden. While it is estimated that about half of all cancer patients would benefit from radiotherapy (RT) for treatment, there is a significant shortage of RT machines outside high-income countries.

More than 10,000 electron linear accelerators (linacs) are currently used worldwide to treat patients with cancer. But only 10% of patients in low-income and 40% in middle-income countries who need RT have access to it. Patients face long waiting times, are forced to travel to neighbouring regions or face insurmountable expenditure to access treatment. In Africa alone, 27 out of 55 countries have no linac-based RT facilities. In those that do, the ratio of the number of machines to people ranges from one machine to 423,000 people in Mauritius, one machine to almost five million people in Kenya and one machine to more than 100 million people in Ethiopia (see “Out of balance” image). In high-income countries such as the US, Switzerland, Canada and the UK, by contrast, the ratio is one RT machine to 85,000, 102,000, 127,000 and 187,000 people, respectively. To draw another stark comparison, Africa has approximately 380 linacs for a population of 1.2 billion while the US has almost 4000 linacs for a population of 331 million.
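
The disparity is easy to quantify; the minimal snippet below simply reproduces the per-capita comparison from the figures quoted in the text.

# Per-capita linac availability, using the figures quoted in the text.
installed_base = {
    "Africa": (380, 1_200_000_000),
    "US": (4000, 331_000_000),
}
for region, (linacs, population) in installed_base.items():
    print(f"{region}: one linac per {population // linacs:,} people")
# Africa: one linac per ~3.2 million people; US: one per ~83,000.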

Unique challenges

It is estimated that to meet the demand for RT in LMICs over the next two to three decades, the current projected need of 5000 RT machines is likely to become more than 12,000. To put these figures into perspective, Varian, the market leader in RT machines, has a current worldwide installation base of 8496 linacs. While many LMICs provide RT using cobalt-60 machines, linacs offer better dose-delivery parameters and better treatment without the environmental and potential terrorism risks associated with cobalt-60 sources. However, since linacs are more complex and labour-intensive to operate and maintain, their current costs are significantly higher than cobalt-60 machines, both in terms of initial capital costs and annual service contracts. These differences pose unique challenges in LMICs, where macro- and micro-economic conditions can influence the ability of these countries to provide linac-based RT. 

The difficulties of operating electron guns

In November 2016 CERN hosted a first-of-its-kind workshop, sponsored by the International Cancer Expert Corps (ICEC), to discuss the design characteristics of RT linacs (see “Linac essentials” image) for the challenging environments of LMICs. Leading experts were invited from international organisations, government agencies, research institutes, universities and hospitals, and companies that produce equipment for conventional X-ray and particle therapy. The following October, CERN hosted a second workshop titled “Innovative, robust and affordable medical linear accelerators for challenging environments”, co-sponsored by the ICEC and the UK’s Science and Technology Facilities Council, STFC. Additional workshops have taken place in March 2018, hosted by STFC in collaboration with CERN and the ICEC, and in March 2019, hosted by STFC in Gaborone, Botswana (see “Healthy vision” image). These and other efforts have identified substantial opportunities for scientific and technical advancements in the design of the linac and the overall RT system for use in LMICs. In 2019, the ICEC, CERN, STFC and Lancaster University entered into a formal collaboration agreement to continue concerted efforts to develop this RT system. 

The idea of novel medical linacs is an excellent example of the impact of fundamental research on wider society

In June 2020, STFC funded a project called ITAR (Innovative Technologies towards building Affordable and equitable global Radiotherapy capacity) in partnership with the ICEC, CERN, Lancaster University, the University of Oxford and Swansea University. ITAR’s first phase was aimed at defining the persistent shortfalls in basic infrastructure, equipment and specialist workforce that remain barriers to effective RT delivery in LMICs. Clearly, a linac suitable for these conditions needs to be low-cost, robust and easy to maintain. Before specifying a detailed design, however, it was first essential to assess the challenges and difficulties RT facilities face in LMICs and in other demanding environments. In June 2021 the ITAR team published an expansive study of RT facilities in 28 African countries, comparing them quantitatively and qualitatively with western hospitals across several domains (see “Downtime” figure). The survey builds on a related 2018 study on the availability of RT services and barriers to providing such services in Botswana and Nigeria, which looked at the equipment maintenance logs of linacs in those countries and selected facilities in the UK.

Surveying the field

The absence of detailed data regarding linac downtime and failure modes makes it difficult to determine the exact impact of the LMIC environment on the performance of current technology. The ongoing ITAR design development and prototyping process identified a need for more information on equipment failures, maintenance and service shortcomings, personnel, training and country-specific healthcare challenges from a much larger representation of LMICs. A further-reaching ITAR survey obtained relevant information for defining design parameters and technological choices based on issues raised at the workshops. They include well-recognised factors such as ease and reliability of operation, machine self-diagnostics and a prominent display of impending or actual faults, ease of maintenance and repair, insensitivity to power interruptions, low power requirement and the consequent reduced heat production.

A standard medical linac

Based on the information from its surveys, ITAR produced a detailed specification and conceptual design for an RT linac that requires less maintenance, has fewer failures and offers fast repair. Over the next three years, under the umbrella of a larger project called STELLA (Smart Technologies to Extend Lives with Linear Accelerators) launched in June 2020, the project will progress to a prototype development phase at STFC’s Daresbury Laboratory. 

The design of the electron gun has been optimised to increase beam capture. This has the dual advantage of reducing both the peak current required from the gun to deliver the requisite dose and “back bombardment”. It also allows for simpler replacement of the electron gun’s cathode by trained personnel (current designs require replacement of the full electron gun or even the full linac).

Electron-beam capture is limited in medical linacs because the pulses from the electron gun are much longer than the radiofrequency (RF) period, meaning electrons are injected at all RF phases. Some phases cause the bunch to be accelerated, while others result in electrons being reflected back to the cathode. In typical linacs, fewer than 50% of electrons reach the target, and many arrive with lower energies. In high-energy accelerators, velocity bunching can be used to compress the bunch; in medical linacs, however, space is limited and the energy gain per cell is often well in excess of the beam energy. To allow velocity bunching in a medical linac, the first cell therefore needs to operate at a low gradient: less space is then required for bunching, because the average beam velocity is much lower and the deceleration is less than the beam energy. By adjusting the lengths of the first and second cells, the decelerated electrons can re-accelerate on the next RF cycle and synchronise with the accelerated electrons, capturing nearly all the electrons and transporting them to the target without a low-energy tail. This is achieved using techniques originally developed for the optimisation of klystrons as part of the Compact Linear Collider project at CERN. By adjusting cell-to-cell coupling, it is possible to operate all the other cells at a higher gradient, similar to a standard medical linac, such that the total linac length remains the same (see “Strong coupling” figure).
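
To see why uniform-phase injection caps the captured fraction, consider the toy model below. It is purely illustrative: the 120° acceptance window is an assumed round number, not the ITAR/STELLA design value. It nevertheless reproduces the sub-50% capture typical of a conventional high-gradient first cell; the low-gradient bunching scheme described above effectively widens this window towards a full RF period, driving the captured fraction towards 100%.

import math
import random

# Toy capture model: gun pulses are much longer than the RF period, so
# electrons arrive uniformly distributed in RF phase. Only those inside
# an acceptance window around the crest stay synchronous and reach the
# target. The window width is an illustrative assumption.
N = 100_000
WINDOW = math.radians(120)  # assumed phase acceptance of the first cell

captured = sum(
    1 for _ in range(N)
    if abs(random.uniform(-math.pi, math.pi)) < WINDOW / 2
)

print(f"captured fraction ~ {captured / N:.2f}")  # ~0.33 for a 120° window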

Designing a Robust and Affordable Radiation Therapy Treatment System for Challenging Environments workshop participants

The electrical power supply in LMICs can often be variable, and protection equipment to isolate harmonics between pieces of equipment is not always installed, so it is critical to consider this when designing the electrical system for RT machines. This in itself is relatively straightforward, but is not normally considered as part of an RT machine design.

The failure of multi-leaf collimators (MLCs), which alter the intensity of the radiation so that it conforms to the tumour volume via several individually actuated leaves, is a major linac downtime issue. Designing MLCs that are less prone to failure will play a key role in RT in LMICs, with studies ongoing into ways to simplify the design without compromising on treatment quality.

Building a workforce

Making it simpler to diagnose and repair faults on linacs is another key area that needs improvement. Given the limited technical staff training in some LMICs, when a machine fails it can be challenging for local staff to make repairs. In addition, components that are degrading can be missed by staff, leading to loss of valuable time to order spares. An important component of the STELLA project, led by ICEC, is to enhance existing and establish new twinning programmes that provide mentoring and training to healthcare professionals in LMICs to build workforce capacity and capability in those regions.

ITAR linac cavity geometry

The idea to address the need for a novel medical linac for challenging environments was first presented by Norman Coleman, senior scientific advisor to the ICEC, at the 2014 ICTR-PHE meeting in Geneva. This led to the creation of the STELLA project, led by Coleman and ICEC colleagues Nina Wendling and David Pistenmaa, which is now using technology originally developed for high-energy physics to bring this idea closer to reality – an excellent example of the impact of fundamental research on wider society. 

The next steps are to construct a full linac prototype to verify the higher capture, as well as to improve the ease of maintaining and repairing the machine. Then we need to have the RT machine manufactured for use in LMICs, which will require many practical and commercial challenges to be overcome. The aim of project STELLA to make RT truly accessible to all cancer patients brings to mind a quote from the famous Nigerian novelist Chinua Achebe: “While we do our good works let us not forget that the real solution lies in a world in which charity will have become unnecessary.” 

Multidisciplinary CERN forum tackles AI https://cerncourier.com/a/multidisciplinary-cern-forum-tackles-ai/ Tue, 21 Dec 2021 11:07:04 +0000 https://preview-courier.web.cern.ch/?p=96719 The structure was designed to stimulate new insights, dialogue and collaboration between AI specialists, scientists, philosophers and ethicists.

Anima Anandkumar

The inaugural Sparks! Serendipity Forum attracted 49 leading computer scientists, policymakers and related experts to CERN from 17 to 18 September for a multidisciplinary science-innovation forum. In this first edition, participants discussed a range of ethical and technical issues related to artificial intelligence (AI), which has deep and developing importance for high-energy physics and its societal applications. The structure of the discussions was designed to stimulate interactions between AI specialists, scientists, philosophers, ethicists and other professionals with an interest in the subject, leading to new insights, dialogue and collaboration between participants.

World-leading cognitive psychologist Daniel Kahneman opened the public part of the event by discussing errors in human decision making, and their impact on AI. He explained that human decision making will always have bias, and therefore be “noisy” in his definition, and asked whether AI could be the solution, pointing out that AI algorithms might not be able to cope with the complexity of decisions that humans have to make. Others speculated as to whether AI could ever achieve the reproducibility of human cognition – and if the focus should shift from searching for a “missing link” to considering how AI research is actually conducted by making the process more regulated and transparent.

Introspective AI

Participants discussed both the advantages and challenges associated with designing introspective AI, which is capable of examining its own processes and could be beneficial in making predictions about the future. Participants also questioned, however, whether we should be trying to make AI more self-aware and human-like. Neuroscientist Ed Boyden explored introspection through the lens of neural pathways, and asked whether we can design introspective AI before we understand introspection in brains. Following the introspection theme, philosopher Luisa Damiano addressed the reality versus fiction of “social-embodied” AI – the idea of robots interacting with us in our physical world – arguing that such a possibility would require careful ethical considerations. 

AI is already a powerful, and growing, tool for particle physics

Many participants advocated developing so-called “strong” AI technology that can solve problems it has not come across before, in line with specific and targeted goals. Computer scientist Max Welling explored the potential for AI to exceed human intelligence, and suggested that AI could one day be as creative as humans, although further research is required. 

On the subject of ethics, Anja Kaspersen (former director of the UN Office for Disarmament Affairs) asked: who makes the rules? Linking to military, humanitarian and technological affairs, she considered how our experience in dealing with nuclear weapons could help us deal with the development of AI. She said that AI is prone to ethics washing: the process of creating an illusory sense that ethical issues are being appropriately addressed when they are not. Participants agreed that we should seek to avoid polarising the community when considering risks associated with current and future AI, and suggested a more open approach to deal with the challenges faced by AI today and tomorrow. Skype co-founder Jaan Tallinn identified AI as one of the most worrying existential risks facing society today; the fact that machines do not consider whether their decisions are unethical demands that we consider the constraints of the AI design space within the realm of decision making. 

Fruits of labour

The initial outcomes of the Sparks! Serendipity Forum are being written up as a CERN Yellow Report, and at least one paper will be submitted to the journal Machine Learning: Science and Technology. Time will tell what other fruit the serendipitous interactions at Sparks! will bear. One thing is certain, however: AI is already a powerful, and growing, tool for particle physics. Without it, the LHC experiments’ analyses would have been much more tortuous, as discussed by Jennifer Ngadiuba and Maurizio Pierini (CERN Courier September/October 2021 p31).

Future editions of the Sparks! Seren­dipity Forum will tackle different themes in science and innovation that are relevant to CERN’s research. The 2022 event will be built around future health technologies, including the many accelerator, detector and simulation technologies that are offshoots of high-energy-physics research. 

The post Multidisciplinary CERN forum tackles AI appeared first on CERN Courier.

]]>
Meeting report The structure was designed to stimulate new insights, dialogue and collaboration between AI specialists, scientists, philosophers and ethicists. https://cerncourier.com/wp-content/uploads/2021/12/CCJanFeb22_Fieldnotes-Sparks.jpg
Training future experts in the fight against cancer https://cerncourier.com/a/training-future-experts-in-the-fight-against-cancer/ Tue, 21 Dec 2021 11:00:49 +0000 https://preview-courier.web.cern.ch/?p=96731 The Heavy Ion Therapy Masterclass school was the first event of the European Union project HITRIplus, in which CERN is a strategic partner.

The post Training future experts in the fight against cancer appeared first on CERN Courier.

]]>

The leading role of CERN in fundamental research is complemented by its contribution to applications for the benefit of society. A strong example is the Heavy Ion Therapy Masterclass (HITM) school, which took place from 17 to 21 May 2021. Attracting more than 1000 participants from around the world, many of whom were young students and early-stage researchers, the school demonstrated the enormous potential to train the next generation of experts in this vital application. It was the first event of the European Union project HITRIplus (Heavy Ion Therapy Research Integration), in which CERN is a strategic partner along with other research infrastructures, universities, industry partners, the four European heavy-ion therapy centres and the South East European International Institute for Sustainable Technologies (SEEIIST). As part of a broader “hands-on training” project supported by the CERN and Society Foundation with emphasis on capacity building in Southeast Europe, the event was originally planned to be hosted in Sarajevo but was held online due to the pandemic. 

The school’s scientific programme highlighted the importance of developments in fundamental research for cancer diagnostics and treatment. Focusing on treatment planning, it covered everything needed to deliver a beam to a tumour target, including the biological response of cancerous and healthy tissues. The Next Ion Medical Machine Study (NIMMS) group contributed many presentations from experts and young researchers, ranging from basic concepts to discussions of open points and plans for upgrades. Expert-guided practical sessions were based on the matRad open-source professional toolkit, developed by the German cancer research centre DKFZ for training and research. Several elements of the course were inspired by the International Particle Therapy Masterclasses.  

Virtual visits to European heavy-ion therapy centres and research infrastructures were ranked by participants among the most exciting components of the course. There were also plenty of opportunities for participants to interact with experts in dedicated sessions, including a popular session on entrepreneurship by the CERN Knowledge Transfer group. This interactive approach had a big impact on participants, several of whom were motivated to pursue careers in related fields and to get actively involved at their home institutes. This expert workforce will become the backbone for building and operating the future heavy-ion therapy and research facilities that are needed to fight cancer worldwide (see Linacs to narrow radiotherapy gap).

Further support is planned at upcoming HITRIplus schools on clinical and medical aspects, as well as HITRIplus internships, to optimally access existing European heavy-ion therapy centres and contribute to relevant research projects. 

The post Training future experts in the fight against cancer appeared first on CERN Courier.

]]>
Meeting report The Heavy Ion Therapy Masterclass school was the first event of the European Union project HITRIplus, in which CERN is a strategic partner. https://cerncourier.com/wp-content/uploads/2021/12/heavy-Ion-masterclass-image-crop.png
Overview of the ITER project, and our variable experiences in the development of some critical components of the magnets https://cerncourier.com/a/overview-of-the-iter-project-and-our-variable-experiences-in-the-development-of-some-critical-components-of-the-magnets/ Wed, 17 Nov 2021 16:36:48 +0000 https://preview-courier.web.cern.ch/?p=96317 This webinar is available to watch now, sponsored by New England Wire Technologies, RadiaSoft LLC and Agilent Technologies.

The post Overview of the ITER project, and our variable experiences in the development of some critical components of the magnets appeared first on CERN Courier.

]]>
By clicking the “Watch now” button you will be taken to our third-party webinar provider in order to register your details.

Want to learn more on this subject?

ITER has now reached the stage where about half of the large magnet components have arrived on site and many more are nearing completion at manufacturing locations distributed throughout the ITER partners. Although we still have several years of challenging on-site assembly ahead, the acceptance tests and first-of-a-kind assembly are teaching us a lot about the magnet quality and possible improvements for future tokamaks.

The webinar, presented by Neil Mitchell, will summarise the present status of manufacturing and assembly. Neil will then choose three areas, critical to magnet and tokamak performance, to describe in more detail:

1. Development of Nb3Sn strands for fusion applications started in the 1980s, and the selection of the material for the Toroidal and Central Solenoid Coils in the first phase of ITER (1988–1991) was a key driver of the overall tokamak parameters. The development, qualification and procurement, both before and after the decision to use it, give us an unusual opportunity to look at the implementation of a novel technology in its entirety, with the expected and unexpected problems we encountered and how they were solved – or tolerated.

2. High-voltage insulation in superconducting magnets is a frequently overlooked area that demands many new technologies. It is the area in the ITER magnets that has created the most quality issues on magnet acceptance and is clearly an area where more engineering attention is required.

3. The need for improvements in overall integration of the magnets into the tokamak, and in particular maintainability and repairability, is being demonstrated as we assemble components into the cryostat. The assembly is proceeding well in terms of quality but at the same time, the complexity shows that for a nuclear power plant, we need improvements.

Want to learn more on this subject?

After completing his PhD at Cambridge University on the fluid mechanics of turbomachinery, Neil Mitchell entered the nuclear-fusion world in 1981, during the completion of the JET tokamak. He participated extensively in the EU’s early superconducting strand and conductor development programme in the 1980s, as well as in the design and manufacture of several small copper-magnet-based magnetic fusion devices, including COMPASS at UKAEA. He was involved in the prototype manufacturing and testing of the superconductors that eventually became the main building blocks of the ITER magnets, and participated in the development and first tests of facilities such as Fenix at LLNL and Sultan at PSI. He has held several positions within the ITER project since joining as one of its founder members in 1988 – in particular as section leader for the ITER conductor in the 1990s, with the highly successful construction and test of the CSMC in Japan and the TFMC in Europe, and subsequently as division head responsible for magnet procurement. In that role he was responsible for finalising the magnet design, negotiating the magnet in-kind procurement agreements with the ITER Home Institutes and direct contracts, and following and assisting industrial production qualification and ramp-up at multiple suppliers in the EU, Japan, Korea, China, the US and Russia. ITER conductor production was completed in 2016 and now, with the completion of the first-of-a-kind magnets, the delivery to the site of several coils and the placement of the first PF coil in the cryostat, he works as an advisor to the ITER director. He is deeply involved in problem solving at the interfaces to the ITER on-site construction as the magnets are delivered, contributing to the magnet control and commissioning plans, and advising the EU on the design of a next-generation fusion reactor.

The post Overview of the ITER project, and our variable experiences in the development of some critical components of the magnets appeared first on CERN Courier.

]]>
Webinar This webinar is available to watch now, sponsored by New England Wire Technologies, RadiaSoft LLC and Agilent Technologies. https://cerncourier.com/wp-content/uploads/2021/11/2021-12-16-CERN-webinar-image.jpg
CERN unveils roadmap for quantum technology https://cerncourier.com/a/cern-unveils-roadmap-for-quantum-technology/ Thu, 04 Nov 2021 14:25:58 +0000 https://preview-courier.web.cern.ch/?p=96222 Exploring the potential of quantum information science and technologies for high-energy physics.

The post CERN unveils roadmap for quantum technology appeared first on CERN Courier.

]]>
Quantum Technology Initiative

Launched one year ago, the CERN Quantum Technology Initiative (QTI) will see high-energy physicists and others play their part in a global effort to bring about the next “quantum revolution”, whereby phenomena such as superposition and entanglement are exploited to build novel computing, communication, sensing and simulation devices (CERN Courier September/October 2020 p47). 

On 14 October, the CERN QTI coordination team announced a strategy and roadmap to establish joint research, educational and training activities, set up a supporting resource infrastructure, and provide dedicated mechanisms for exchange of knowledge and technology. Oversight for the CERN QTI will be provided by a newly established advisory board composed of international experts nominated by CERN’s 23 Member States.

As an international, open and neutral platform, the roadmap document explains, CERN is uniquely positioned to act as an “honest broker” to facilitate cross-disciplinary discussions between CERN Member States and to foster innovative ideas in high-energy physics and beyond. This is underpinned by several R&D projects that are already under way at CERN across four main areas: quantum computing and algorithms; quantum theory and simulation; quantum sensing, metrology and materials; and quantum communication and networks. These projects target applications such as quantum-graph neural networks for track reconstruction, quantum support vector machines for particle classification, and quantum generative adversarial networks for physics simulation, as well as new sensors and materials for future detectors, and quantum-key-distribution protocols for distributed data analysis.
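
To give a flavour of one entry on that list, a quantum support vector machine replaces a classical kernel with the overlap of quantum feature-map states. The sketch below is an illustrative toy, not CERN QTI code: it evaluates in closed form the kernel of the simplest such feature map (one Ry rotation per feature) and feeds it to a classical SVM. A real QSVM would estimate these overlaps on quantum hardware, and the dataset here is invented purely for demonstration.

```python
import numpy as np
from sklearn.svm import SVC

def quantum_kernel(X1, X2):
    """Overlap kernel for a product Ry(x_k)|0> feature map.

    For this encoding, |<phi(x)|phi(x')>|^2 factorises into
    prod_k cos^2((x_k - x'_k)/2), so the "quantum" kernel can be
    evaluated classically -- handy for checking toy examples.
    """
    diff = X1[:, None, :] - X2[None, :, :]   # pairwise feature differences
    return np.prod(np.cos(diff / 2.0) ** 2, axis=-1)

# Invented two-feature dataset, scaled into [0, pi] so the encoding is faithful
rng = np.random.default_rng(seed=1)
X = rng.uniform(0.0, np.pi, size=(80, 2))
y = (np.sin(X[:, 0]) * np.sin(X[:, 1]) > 0.5).astype(int)

clf = SVC(kernel="precomputed").fit(quantum_kernel(X, X), y)
print(f"training accuracy: {clf.score(quantum_kernel(X, X), y):.2f}")
```

The interest of such models is that richer, entangling feature maps give kernels that are hard to evaluate classically; the factorising map above is chosen precisely so the toy stays checkable.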

Education and training are also at the core of the CERN QTI. Building on the success of its first online course on quantum computing, the initiative plans to extend its academia–industry training programme to build competencies across different R&D and engineering activities for the new generation of scientists, from high-school students to senior researchers. 

Co-chairs of the CERN QTI advisory board, Kerstin Borras and Yasser Omar, stated: “The road map builds on high-quality research projects already ongoing at CERN, with top-level collaborations, to advance a vision and concrete steps to explore the potential of quantum information science and technologies for high-energy physics”.

The post CERN unveils roadmap for quantum technology appeared first on CERN Courier.

]]>
News Exploring the potential of quantum information science and technologies for high-energy physics. https://cerncourier.com/wp-content/uploads/2021/11/CCNovDec21_NA_QTI.jpg
2021 Nobel Prize recognises complexity https://cerncourier.com/a/2021-nobel-prize-recognises-complexity/ Thu, 04 Nov 2021 14:18:26 +0000 https://preview-courier.web.cern.ch/?p=96227 Giorgio Parisi, Klaus Hasselmann and Syukuro Manabe made groundbreaking contributions to the understanding of complex physical systems, such as Earth's climate.

The post 2021 Nobel Prize recognises complexity appeared first on CERN Courier.

]]>
Parisi, Hasselmann and Manabe

On 5 October, Syukuro Manabe (Princeton), Klaus Hasselmann (MPI for Meteorology) and Giorgio Parisi (Sapienza University of Rome) were announced as the winners of the 2021 Nobel Prize in Physics for their groundbreaking contributions to the understanding of complex physical systems, which provided rigorous scientific foundations to our understanding of Earth’s climate. Sharing half the 10 million Swedish kronor award, Manabe and Hasselmann were recognised “for the physical modelling of Earth’s climate, quantifying variability and reliably predicting global warming”. Parisi, who started out in high-energy physics, received the other half of the award “for the discovery of the interplay of disorder and fluctuations in physical systems from atomic to planetary scales”.

In the early 1960s, Manabe developed a radiative-convective model of the atmosphere and explored the role of greenhouse gases in maintaining and changing the atmosphere’s thermal structure. It was the beginning of a decades-long research programme on global warming that he undertook in collaboration with the Geophysical Fluid Dynamics Laboratory, NOAA. Hasselmann, who was founding director of the Max Planck Institute for Meteorology in Hamburg from 1975 to 1999, developed techniques that helped establish the link between anthropogenic CO2 emissions and rising global temperatures. He published a series of papers in the 1960s on non-linear interactions in ocean waves, in which he adapted Feynman-diagram formalism to classical random-wave fields.

Parisi, a founder of the study of complex systems, enabled the understanding and description of many different and apparently entirely random materials and phenomena in physics, biology and beyond, including the flocking of birds. Early in his career, he also made fundamental contributions to particle physics, the most well-known being the derivation, together with the late Guido Altarelli and others, of the “DGLAP” QCD evolution equations for parton densities. “My mentor Nicola Cabibbo was usually saying that we should work on a problem only if working on the problem is fun,” said Parisi following the announcement. “So I tried to work on something that was interesting and which I believed that had some capacity to add something.”

As per last year, the traditional December award ceremony will take place online due to COVID-19 restrictions. 

The post 2021 Nobel Prize recognises complexity appeared first on CERN Courier.

]]>
News Giorgio Parisi, Klaus Hasselmann and Syukuro Manabe made groundbreaking contributions to the understanding of complex physical systems, such as Earth's climate. https://cerncourier.com/wp-content/uploads/2021/11/CCNovDec21_NA_Nobel_feature.jpg
Making complexity irrelevant https://cerncourier.com/a/making-complexity-irrelevant/ Thu, 04 Nov 2021 14:02:22 +0000 https://preview-courier.web.cern.ch/?p=96271 Headed by two ATLAS physicists, gluoNNet applies data-mining and machine-learning techniques to benefit wider society.

The post Making complexity irrelevant appeared first on CERN Courier.

]]>
One day’s worth of flight data

Describing itself as a big-data graph-analytics start-up, gluoNNet seeks to bring data analysis from CERN into “real-life” applications. Just two years old, the 12-strong firm based in Geneva and London has already aided clients with decision making by simplifying publicly available datasets. With studies predicting that, in three to four years, almost 80% of data and analytics innovations may come from graph technologies, the team of physicists aims to be the “R&D department” for medium-sized companies and help them evaluate massive volumes of data in a matter of minutes.

gluoNNet co-founder and president Daniel Dobos, an honorary researcher at Lancaster University, first joined CERN in 2002, focusing on diamond and silicon detectors for the ATLAS experiment. A passion to share technology with a wider audience soon led him to collaborate with organisations and institutes outside the field. In 2016 he became head of foresight and futures for the United Nations-hosted Global Humanitarian Lab, which strives to bring up-to-date technology to countries across the world. He and co-founder Karolos Potamianos, a fellow ATLAS collaborator and Ernest Rutherford Fellow at the University of Oxford, have been collaborating on non-physics projects since 2014. An example is THE Port Association, which organises in-person and online events together with CERN IdeaSquare and other partners, including “humanitarian hackathons”.

CERN’s understanding of big data is different to others’

Daniel Dobos

gluoNNet was a natural next step to bring data analysis from high-energy physics into broader applications. It began as a non-profit, with most work being non-commercial and helping non-governmental organisations (NGOs). Working with UNICEF, for example, gluoNNet tracked countries’ financial transactions on fighting child violence to see if governments were standing by their commitments. “Our analysis even made one country – which was already one of the top donors – double their contribution, after being embarrassed by how little was actually being spent,” says Dobos.

But Dobos was quick to realise that for gluoNNet to become sustainable it had to incorporate, which it did in 2020. “We wanted to take on jobs that were more impactful, however they were also more expensive.” A second base was then added in the UK, which enabled more ambitious projects to be taken on.

Tracking flights

One project arose from an encounter at CERN IdeaSquare. The former head of security of a major European airline had visited CERN and noticed the particle-tracking technology as well as the international and collaborative environment; he believed something similar was needed in the aviation industry. During the visit a lively discussion about the similarities between data in aviation and particle tracking emerged. This person later became a part of the Civil Aviation Administration of Kazakhstan, which gluoNNet now works with to create a holistic overview of global air traffic (see image above). “We were looking for regulatory, safety and ecological misbehaviour, and trying to find out why some airplanes are spending more time in the air than they were expected to,” says Kristiane Novotny, a theoretical physicist who wrote her PhD thesis at CERN and is now a lead data scientist at gluoNNet. “If we can find out why, we can help reduce flight times, and therefore reduce carbon-dioxide emissions due to shorter flights.” 
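
The article does not detail gluoNNet’s pipeline, but the basic screen Novotny describes – flagging flights that spend longer airborne than their route warrants – can be sketched in a few lines. Everything below (the column names, the 800 km/h cruise speed, the fixed climb-and-descent allowance, the flagging threshold) is an illustrative assumption, not the company’s actual method.

```python
import numpy as np
import pandas as pd

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * np.arcsin(np.sqrt(a))

def flag_excess_airtime(flights: pd.DataFrame,
                        cruise_kmh: float = 800.0,   # assumed typical cruise speed
                        overhead_h: float = 0.3,     # crude climb/descent allowance
                        threshold_h: float = 0.5) -> pd.DataFrame:
    """Return flights whose airborne time exceeds a great-circle estimate.

    Expects columns dep_lat, dep_lon, arr_lat, arr_lon, airborne_h
    (a hypothetical schema -- real ADS-B data needs heavy cleaning first).
    """
    distance = haversine_km(flights.dep_lat, flights.dep_lon,
                            flights.arr_lat, flights.arr_lon)
    expected_h = distance / cruise_kmh + overhead_h
    flights = flights.assign(excess_h=flights.airborne_h - expected_h)
    return flights[flights.excess_h > threshold_h].sort_values(
        "excess_h", ascending=False)
```

Summing the excess hours over a fleet and multiplying by a per-hour fuel-burn figure would give a first-order estimate of the avoidable carbon-dioxide emissions the article alludes to.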

Using experience acquired at CERN in processing enormous amounts of data, gluoNNet’s data-mining and machine-learning algorithms benefit from the same attitude to data as CERN’s, explains Dobos. “CERN’s understanding of big data is different to others’. For some companies, what doesn’t fit in an Excel sheet is considered ‘big data’, whereas at CERN this is minuscule.” Therefore, it is no accident that most in the team are CERN alumni. “We need people who have the CERN spirit,” he states. “If you tell people at CERN that we want to get to Mars by tomorrow, they will get on and think about how to get there, rather than shutting down the idea.”

Though it’s still early days for gluoNNet, the team is undertaking R&D to take things to the next level. Working with CERN openlab and the Middle East Technical University’s Application and Research Center for Space and Accelerator Technologies, for example, gluoNNet is exploring the application of quantum-computing algorithms (namely quantum-graph neural networks) for particle-track reconstruction, as well as industrial applications, such as the analysis of aviation data. Another R&D effort, which originated at the Pan European Quantum Internet Hackathon 2019, aims to make use of quantum key distribution to achieve a secure VPN (virtual private network) connection. 

One of gluoNNet’s main future projects is a platform that can provide an interconnected system for analysts and decision makers at companies. The platform would allow large amounts of data to be uploaded and presented clearly, with Dobos explaining, “Companies have meetings with data analysts back and forth for weeks on decisions; this could be a place that shortens these decisions to minutes. Large technology companies start to put these platforms in place, but they are out of reach for small and medium sized companies that can’t develop such frameworks internally.”

The vast amounts of data we have available today hold invaluable insights for governments, companies, NGOs and individuals, says Potamianos. “Most of the time only a fraction of the actual information is considered, missing out on relationships, dynamics and intricacies that data could reveal. With gluoNNet, we aim to help stakeholders that don’t have in-house expertise in advanced data processing and visualisation technologies to get insights from their data, making its complexity irrelevant to decision makers.”

The post Making complexity irrelevant appeared first on CERN Courier.

]]>
Careers Headed by two ATLAS physicists, gluoNNet applies data-mining and machine-learning techniques to benefit wider society. https://cerncourier.com/wp-content/uploads/2021/11/CCNovDec21_CAREERS_gluonet.jpg
ITER powers ahead https://cerncourier.com/a/iter-powers-ahead/ Wed, 03 Nov 2021 13:05:38 +0000 https://preview-courier.web.cern.ch/?p=95712 Assembly of the tokamak that will confine a 150-million-degree plasma inside the ITER fusion experiment is under way.

The post ITER powers ahead appeared first on CERN Courier.

]]>
A D-shaped toroidal magnet coil

At the heart of the ITER fusion experiment is an 18 m-tall, 1000-tonne superconducting solenoid – the largest ever built. Its 13 T field will induce a 15 MA plasma current inside the ITER tokamak, initiating a heating process that ultimately will enable self-sustaining fusion reactions. Like all things ITER, the scale and power of the central solenoid is unprecedented. Fabrication of its six niobium-tin modules began nearly 10 years ago at a purpose-built General Atomics facility in California. The first module left the factory on 21 June and, after travelling more than 2400 km by road and then crossing the Atlantic, the 110-tonne component arrived at the ITER construction site in southern France on 9 September. During a small ceremony marking the occasion, the director of engineering and projects for General Atomics described the job as: “among the largest, most complex and demanding magnet programmes ever undertaken” and “the most important and significant project of our careers.”

The US is one of seven ITER members, along with China, the European Union, India, Japan, Korea and Russia, which ratified an international agreement in 2007. Each member shares in the cost of project construction, operation and decommissioning, and also in the experimental results and any intellectual property. Europe is responsible for the largest portion of construction costs (45.6%), with the remainder shared equally by the other members. Mirroring the successful model of collider experiments at CERN, the majority (85%) of ITER-member contributions are to be delivered in the form of completed components, systems or buildings – representing untold hours of highly skilled work both in the member states and at the ITER site. 

First plasma

Assembly of the tokamak, which got under way in 2020, marks an advance to a crucial new phase for the ITER project. Production of its 18 D-shaped coils that provide the toroidal magnetic field, each 17 m high and weighing 350 tonnes, is in full swing, while its circular poloidal coils are close to completion. The remaining solenoid modules and all other major tokamak components are scheduled to be on site by mid-2023. Despite the impact of the global pandemic, the ITER teams are working towards the baseline target for “first plasma” by the end of 2025, with more than 2000 persons on site each day. 

A plasma in a torus-shaped tokamak

ITER’s purpose is to demonstrate the scientific and technological feasibility of fusion power for peaceful purposes. Key objectives are defined for this demonstration, namely: production of 500 MW of fusion power with a ratio of fusion power to input heating power (Q) of at least 10 for at least 300 seconds, and sustainment of fusion power with Q = 5 consistent with steady-state operation. The key factor in reaching these objectives is the world’s largest tokamak, a concept whose name comes from a Russian acronym roughly translated “toroidal chamber with magnetic coils”. This could also describe CERN’s Large Hadron Collider (LHC), but as we will see, the two magnetic confinement schemes are significantly different.

Among the largest, most complex and demanding magnet programmes ever undertaken

ITER chose deuterium and tritium (heavier variants of ordinary hydrogen) for its fuel because the D–T cross-section is the highest of all known fusion reactions. However, the energy at which the cross-section is maximum (~65 keV) is equivalent to almost 1 billion degrees. As a result, the fuel will no longer be in the form of gas as it is introduced but in the plasma state, where it is broken down into its electrically charged components (ions and electrons). As in the LHC, the electric charge makes it possible to hold the ions and electrons in place using magnetic fields generated by electromagnets – in both cases by superconducting magnets held at temperatures near absolute zero to avoid massive electrical consumption.  
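
The quoted temperature is a straightforward unit conversion: dividing the energy at which the D–T cross-section peaks by the Boltzmann constant gives

```latex
T \;=\; \frac{E}{k_{\rm B}}
  \;=\; \frac{65\times10^{3}\ \mathrm{eV}}{8.617\times10^{-5}\ \mathrm{eV\,K^{-1}}}
  \;\approx\; 7.5\times10^{8}\ \mathrm{K},
```

i.e. approaching a billion kelvin. In practice, a tokamak such as ITER operates nearer the 150 million degrees quoted for its plasma, where the reactivity averaged over the thermal distribution is already favourable.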

ITER’s cryostat base

A simple picture of how the magnets in ITER work together to confine a plasma with temperatures greater than 100 million degrees begins with the toroidal field coils (see “Trapping a plasma” figure). Eighteen of these are arranged to make a magnetic field that is circular, centred on a vertical line. Charged particles, to the crudest approximation, follow the magnetic field, so it would seem that the problem of confining them is solved. However, at the next level of approximation, the charged particles actually make small “gyro-orbits”, like beads on a wire. This introduces a difficulty because the “gyroradius” of these orbits depends on the strength of the magnetic field, and the toroidal magnetic field increases in strength closer to the vertical line defining its centre. This means that the gyroradius is smaller on the inner part of the orbit, which leads to a vertical motion of the charged particles. Since the direction of motion depends on the charge of the particle, however, the opposite charges move away from each other. This makes a vertical electric field which, when combined with the toroidal field, rapidly expels charged particles radially outward – eliminating confinement! In the 1950s, two Russian physicists, Tamm and Sakharov, proposed that a current flowing in the plasma in the toroidal direction would generate a net helical field, and that charged particles flowing along the total field would short out the electric field, leading to confinement. This was the invention of the tokamak magnetic confinement concept.  
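
The chain of reasoning above corresponds to three standard guiding-centre formulas, reproduced here for reference: the gyroradius that shrinks where the field is stronger, the grad-B drift whose sign depends on the particle’s charge (separating ions from electrons vertically), and the E×B drift that is charge-independent and so pushes the whole plasma outward together:

```latex
\rho = \frac{m v_{\perp}}{|q|\,B}, \qquad
\mathbf{v}_{\nabla B} = \frac{m v_{\perp}^{2}}{2\,q\,B^{3}}\;\mathbf{B}\times\nabla B, \qquad
\mathbf{v}_{E\times B} = \frac{\mathbf{E}\times\mathbf{B}}{B^{2}}.
```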

Magnetic configuration

In ITER, this current is generated by the powerful central solenoid, aligned on the vertical line at the centre of the toroidal field. It acts as the primary winding of a transformer, with the plasma as the secondary. There remains one more issue to address, again with magnets. The pressure and current in the plasma result in a force that tries to push the plasma further from the vertical line at the centre. To counter this force in ITER, six “poloidal field” coils are aligned – again about the vertical centreline – to generate vertical fields that push the plasma back toward the vertical line and also shape the plasma in ways that enhance performance. A number of correction coils will complete ITER’s complex magnetic configuration, which will demonstrate the deployment of the Nb3Sn conductor – the same as is being implemented for high-field accelerator magnets at the High-Luminosity LHC and as proposed for future colliders – on a massive scale. CERN signed a collaboration agreement with ITER in 2008 concerning the design of high-temperature superconducting current leads and other magnet technologies, and acted as one of the “reference” laboratories for testing ITER’s superconducting strands. 
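
The transformer action of the central solenoid can be stated in one line: ramping the solenoid current swings a magnetic flux Φ through the torus, and by Faraday’s law the loop voltage induced around the plasma, which drives its toroidal current, is

```latex
V_{\rm loop} \;=\; -\,\frac{\mathrm{d}\Phi}{\mathrm{d}t}.
```

One consequence is that purely inductive drive lasts only as long as there is flux left to swing – one reason the steady-state Q = 5 objective mentioned earlier relies on additional, non-inductive means of sustaining the current.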

The first of ITER’s poloidal field coils

Despite the pandemic disrupting production and transport, the first step of ITER’s tokamak assembly sequence – the installation of the base of the cryostat into the tokamak bioshield – was achieved in May 2020. The ITER cryostat, which must be made of non-magnetic stainless steel, will keep the entire (30 m diameter by 30 m high) tokamak assembly at the low temperatures necessary for the magnets to function. It comes in four pieces (base, lower and upper cylinders, and lid) that are welded together in the tokamak building. At 1250 tonnes, the cryostat-base lift was the heaviest of the entire assembly sequence, its successful completion officially starting the assembly sequence (see “Heavy lifting” image). Later in 2020, the lower cylinder was then installed and welded to the base. 

Bottle up

With the “bottle” to hold the tokamak placed in position, installation of the electromagnets could begin. The two poloidal field coils at the bottom of the tokamak, PF6 and PF5, had to be installed first. PF6 was placed inside the cryostat earlier this year (see “Poloidal descent” image), while the second was lifted into place this September. The next big milestone is the assembly and installation of the first “sector” of the tokamak. The vacuum vessel in which the fusion plasma is made is divided into nine equal sectors (like the slices of an orange), due to limitations on the lifting capacity and to facilitate parallel fabrication of these large objects. Each sector of the vacuum vessel (see “Monster moves” image) has two toroidal field coils associated with it. 

ITER vacuum-vessel sector

In August, this vacuum vessel and its associated thermal shields were assembled together with the toroidal field coils on the sector sub-assembly tool for the first time (see “Shaping up” image). Once joined into a single unit, it will be installed in the cryostat in late 2021. The second vacuum-vessel sector arrived on site in August and will be assembled with the two associated toroidal-field coils already on site, with a target to install the final unit in the cryostat early in 2022. Sector components are scheduled to arrive, be put together, and then installed in the cryostat and welded together in assembly-line fashion, with the closure of the vacuum vessel scheduled for the end of 2023. The six central-solenoid modules are also to be assembled outside the cryostat into a single structure and installed in the cryostat shortly before closure. Following the arrival of the first module this summer, the second is complete and ready for shipping. Of the remaining four niobium-titanium poloidal field magnets, three are being fabricated on-site because they are too large to transport by road and all four are in advanced stages of production.  

Of course, there is more to ITER than its tokamak. In parallel, work on the supporting plant is under way. Four large transformers, which draw the steady-state electrical power from the grid, have been in operation since early 2019, while the medium- and low-voltage load centres that power clients in the plant buildings have been turned over to the operations division. The secondary and tertiary cooling systems, the chilled water and demineralised water plants, and the compressed-air and breathable-air plants are also currently being commissioned. The three large transformers that connect the pulsed power supplies for the magnets and the plasma heating systems have been qualified for operation on the 400 kV grid. The next big steps are the start of functional testing of the cryoplant and the reactive power compensation at the end of this year, and of the magnet power supplies and the first plasma heating system early in 2022. 

The 180-hectare ITER site

Perhaps the most common question one encounters when talking about ITER is: when will tokamak operations begin? Following the closure of the vacuum vessel in 2023, the current baseline schedule includes one year of installation work inside the cryostat before its closure, followed by integrated commissioning of the tokamak in 2025, culminating in “first plasma” by the end of 2025. By mandate from ITER’s governing body, the ITER Council, this schedule was put into place in 2016 as the “fastest technically achievable”, meaning no contingency. Clearly the pandemic has impacted the ability to meet that schedule, but the actual impact is still not possible to determine accurately. The challenge in this assessment is that 85% of the ITER components are delivered as in-kind contributions from the ITER members, and the pandemic has affected and continues to affect the manufacturing work on items that take years to complete. The components now being installed were substantially complete at the onset of the pandemic, but even these deliveries have encountered difficulties due to the disruption of the global shipping industry. Component installation in the tokamak complex has also been impacted by limited availability of components, goods and services. The possibility of recovery actions or further restrictions is not possible to predict with the needed accuracy today. In this light, the ITER Council has challenged us to do the best possible effort to maintain the baseline schedule, while preparing an assessment of the impact for consideration of a revised baseline schedule next year. The ITER Organization, domestic agencies in the ITER members responsible for supplying in-kind components, and contractors and suppliers around the world are working together to meet this additional challenge.  

What the future holds

ITER is expected to operate for 20 years, providing crucial information about both the science and the technology necessary for a fusion power plant. For the science, beyond the obvious interest in meeting ITER’s performance objectives, qualitative frontiers will be crossed in two essential areas of plasma physics. First, ITER will be the first “burning” plasma, where the dominant heating power to sustain the fusion output comes directly from fusion itself. Aspects of the relevant physics have been studied for many years, but the operating point of ITER places it in a fundamentally different regime from present experiments. The same is true of the second frontier: the handling of heat and particle exhaust in ITER. There is a qualitative difference predicted by our best simulation capabilities between the ITER operating point and present experiments. This is also the first touch-point between the physics and the technology: the physics must enable the survival of the wall, while the wall must allow the plasma physics to yield the conditions needed for the fusion reactions. Other essential technologies such as the means to make new fusion fuel (tritium), recycling of the fuel in use in real-time and remote handling for maintenance activities will all be pioneered in ITER.

ITER will provide crucial information about both the science and technology necessary for a fusion power plant

While ITER will demonstrate the potential for fusion energy to become the dominant source of energy production, harnessing that potential requires the demonstration not just of the scientific and technical capabilities but of the economic feasibility too. The next steps along that path are true demonstration power plants – “DEMOs” in fusion jargon – that explore these steps. ITER members are already exploring DEMO options, but no commitments have yet been made. The continuing advance of ITER is critical not just to motivate these next steps but also as a vision of a future where the world is powered by an energy source with universally available fuel and no impact on the environment. What a tremendous gift that would be for future generations.

The post ITER powers ahead appeared first on CERN Courier.

]]>
Feature Assembly of the tokamak that will confine a 150-million-degree plasma inside the ITER fusion experiment is under way. https://cerncourier.com/wp-content/uploads/2021/10/CCNovDec21_ITER_frontis.jpg
World’s most powerful MRI unveiled https://cerncourier.com/a/worlds-most-powerful-mri-unveiled/ Mon, 01 Nov 2021 09:34:17 +0000 https://preview-courier.web.cern.ch/?p=96123 The 11.7 T magnet driving CEA’s Iseult project is rooted in technology transfer with fusion and particle physics.

The post World’s most powerful MRI unveiled appeared first on CERN Courier.

]]>
A 132-tonne superconducting magnet has set a new record for whole-body magnetic-resonance imaging (MRI), producing a field of 11.7 T inside a 0.9 m diameter and 5 m long volume. Four times more powerful than typical hospital devices, the “Iseult” project at CEA-Paris-Saclay paves the way for imaging the brain in unprecedented detail for medical research.

Using a pumpkin as a suitably brain-like subject, the team released its first images on 7 October, validating the system and demonstrating an initial resolution of 400 microns in three dimensions. Other checks and approvals are necessary before the first imaging of human volunteers can begin.

This work will undoubtedly lead to major clinical applications

Stanislas Dehaene

“Thanks to this extraordinary MRI, our researchers are looking forward to studying the anatomical and structural organization of the brain in greater detail. This work will undoubtedly lead to major clinical applications,” said Stanislas Dehaene, director of NeuroSpin, the neuroimaging platform at CEA-Paris-Saclay.

The magnets that drive tens of thousands of MRI devices worldwide perform the vital task of aligning the magnetic moments of hydrogen atoms. Then, RF pulses are used to momentarily disturb this order in a specific region, after which the atoms are pulled back into equilibrium by the magnetic field, and radiate. The stronger the field, the higher the signal-to-noise ratio, and thus the better the image resolution.
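
The field dependence described here follows from the Larmor relation: the hydrogen nuclei precess – and the RF pulses must be tuned – at a frequency proportional to the field. For protons the gyromagnetic ratio is about 42.58 MHz per tesla, so at Iseult’s field

```latex
f_{0} \;=\; \frac{\gamma}{2\pi}\,B_{0}
      \;\approx\; 42.58\ \mathrm{MHz\,T^{-1}} \times 11.7\ \mathrm{T}
      \;\approx\; 498\ \mathrm{MHz},
```

roughly four times the operating frequency of a standard 3 T clinical scanner.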

Niobium-titanium

In addition to being the largest and most powerful MRI magnet ever built, claims the team, the Iseult solenoid (carrying a current of 1.5 kA) also sets a record for the highest field ever achieved using niobium-titanium conductor, the same as is used in the present LHC magnets. With various optimisations, and working with the European Union Aroma project on methodologies for optimal functioning of the new MRI device, a resolution approaching 100 to 200 microns is planned – around ten times better than that of commercial 3 T devices.

Designed and built over ten years, Iseult was jointly led by neuroscientists and magnet and MRI specialists at the CEA Institute of Research into the Fundamental Laws of the Universe (IRFU) and the Frédéric Joliot Institute for Life Sciences, along with several industry and academic partnerships in Germany. Although CERN was not directly involved, Iseult’s success is anchored in more than four decades of joint developments between CERN and the CEA, explains Anne-Isabelle Etienvre, head of CEA IRFU:

“It is thanks to the know-how developed for particle physics and fusion that MRI experts had the idea to ask us to design and build this unique and challenging magnet for MRI — in particular, CEA has played a major role, together with CERN and other partners, on LHC magnets, the ATLAS toroidal magnets and the CMS solenoid,” says Etienvre. “The collaboration between CEA and CERN is still very lively, in particular for advanced magnets for future accelerators.”

The post World’s most powerful MRI unveiled appeared first on CERN Courier.

]]>
News The 11.7 T magnet driving CEA’s Iseult project is rooted in technology transfer with fusion and particle physics. https://cerncourier.com/wp-content/uploads/2021/11/1100x619_cmsv2_6c0dc55a-de0b-5001-ab28-530bcd772e2b-6134576.jpg
Rare isotopes aplenty at FRIB https://cerncourier.com/a/rare-isotopes-aplenty-at-frib/ Mon, 27 Sep 2021 13:05:23 +0000 https://preview-courier.web.cern.ch/?p=94960 The Facility for Rare Isotope Beams in Michigan underpins an ambitious programme to transform nuclear physics and its applications.

The post Rare isotopes aplenty at FRIB appeared first on CERN Courier.

]]>
The 400 kW SRF linac

The $730 million Facility for Rare Isotope Beams (FRIB) at Michigan State University (MSU) is scheduled to come online in early 2022 – a game-changer in every sense for the US and international nuclear-physics communities. With peer review and approval of the first round of experimental proposals now complete, an initial cohort of scientists from 25 countries is making final preparations to exploit FRIB’s unique capabilities. Their goal: to open up new frontiers in the fundamental study of rare and unstable isotopes as well as identifying promising candidate isotopes for real-world applications. 

The engine-room of the FRIB scientific programme is an all-new 400 kW superconducting radiofrequency (SRF) linac. In short: the world’s most powerful heavy-ion driver accelerator, firing beams of stable isotopes at targets of lighter nuclei (for example, carbon or beryllium). Amid the chaos of flying particles, two nuclei will occasionally collide, reacting to form a rare and unstable isotope – a process that ultimately delivers high-intensity beams of rare isotopes to FRIB’s experimental end-stations and a suite of scientific instruments. 

Funded by the US Department of Energy Office of Science (DOE-SC), and supported by MSU cost-share and contributions, FRIB will operate as a traditional big-science user facility, with beam-time granted via merit review of proposals and access open to all interested researchers. Here, FRIB’s scientific director, Bradley Sherrill, tells CERN Courier how the laboratory is gearing up for “go-live” and the importance of wide-ranging engagement with the international user community, industry and other rare-isotope facilities.

What are the overarching objectives of the FRIB scientific mission?

Bradley Sherrill

There are four main strands to the FRIB science programme. For starters, user experiments will generate a wealth of data to advance our understanding of the nucleus – how it’s put together and how we can develop theoretical nuclear models and their approximations. At the same time, the research programme will yield unique insights on the origins of the chemical elements in the universe, providing access to most of the rare isotopes involved in extreme astrophysical processes such as supernovae and neutron-star mergers. Other scientists, meanwhile, will use isotopes produced at FRIB to devise experiments that look beyond the Standard Model, searching for subtle indications of hidden interactions and minutely broken symmetries. Finally, FRIB will generate research quantities of rare isotopes to feed into R&D efforts on next-generation applications – from functional medical imaging to safer nuclear reactors and advanced detector technologies.

What is FRIB’s biggest differentiator?  

The 400 kW SRF linac is the heart of FRIB’s value proposition to the research community, opening up access to a much broader spectrum of rare isotopes than hitherto possible – in fact, approximately 80% of the isotopes predicted to exist. It is worth noting, though, that FRIB does not exist in isolation. It’s part of a global research ecosystem, with a network of collaborations ongoing with other rare-isotope facilities – among them RIKEN’s RI Beam Factory in Japan, RAON in Korea, ISOLDE at CERN, FAIR in Germany, GANIL in France and ISAC at TRIUMF in Canada. Collectively, FRIB and this global network of laboratories are well placed to deliver unprecedented – and complementary – advances across the nuclear-science landscape over the coming decades.

Is it realistic to expect broader commercial opportunities to emerge from FRIB’s research programme? 

There’s a high likelihood of FRIB yielding spin-off technologies and commercial applications down the line. One of the game-changers with FRIB is the quantities of rare isotopes the beamline can produce with high efficiency – a production scheme that enables us to make a broad swathe of isotopes relatively quickly and with high purity. That capability, in turn, will enable potential early-adopters in industry to fast-track the evaluation of novel applications and, where appropriate, to figure out how to produce the isotopes of interest at scale (see “FRIB’s bumper harvest will fuel applied science and innovation”). 

How is FRIB engaging with the scientific user community across academia, industry and government agencies? 

FRIB enjoys strong links with its future users – both here in the US and internationally – and meets with them regularly at planning events to identify and coordinate research opportunities. Earlier this year, in response to our first call for proposals, we received 82 project submissions and six letters of intent from 130 institutions across 30 countries. Those science proposals were subsequently peer-reviewed by the FRIB Programme Advisory Committee (PAC), an international group of nuclear science experts which I convene, to yield an initial set of experiments that will get underway once FRIB commences user operations in early 2022. 

Those PAC-recommended experiments align with national science priorities across the four FRIB priority areas: properties of rare isotopes; nuclear astrophysics; fundamental interactions; and applications for society. The headline numbers saw 34 (out of 82 requested) experiments approved with a projected 4122 facility-use hours. There are 88 institutions, 24 US states and 25 countries represented in the initial experimental programme.

FRIB’s bumper harvest will fuel applied science and innovation

An excess of useful radioisotopes will be formed as FRIB fulfils its basic science mission of providing rare-isotope beams to feed a broad-scope international user programme. For the FRIB beams to reach high purity, though, the vast majority of these “surplus” isotopes will end up discarded in a water-filled beam dump – stranded assets that go unused and remain largely unexplored. 

With this in mind, the DOE-SC Office of Nuclear Physics, through the DOE Isotope Programme, has awarded FRIB scientists $13 million in funding over the next four years to build up FRIB’s isotope harvesting capabilities. The hope is that systematic recovery of the surplus isotopes – without impacting FRIB’s primary users – could open up novel lines of enquiry in applied research – from biochemistry to nuclear medicine, and from radiothermal generators to nuclear-weapons stockpile stewardship.

“This grant is about broadening the scientific impact of FRIB,” says Greg Severin, lead investigator for the harvesting project at FRIB. “While physicists at FRIB are making ground-breaking fundamental discoveries, our team will be supporting exciting opportunities in applied science.”

In 2018, the DOE-SC awarded Severin and colleagues an initial grant to prove that isotope harvesting is feasible. Their proof-of-concept involved building a small-scale isotope harvester in FRIB’s predecessor, the National Superconducting Cyclotron Laboratory at MSU. 

Now, with follow-on funding secured, Severin’s team is scaling up, with construction of a dedicated Isotope Harvesting Vault at FRIB in the works and set for completion in 2024.

See also “Isotope harvesting at FRIB: additional opportunities for scientific discovery” (J. Phys. G: Nucl. Part. Phys. 2019 46 100501). 

What are the opportunities for early-career scientists and engineers at FRIB?

Developing the talent pipeline is part of the organisational DNA here at FRIB. There’s a structured educational framework to pass on the expertise and experience of senior FRIB staff to the next generation of researchers, engineers and technicians in nuclear science. MSU’s Accelerator Science and Engineering Traineeship (ASET) programme is a case in point. ASET leverages multidisciplinary expertise from FRIB and MSU colleagues to support specialisation in four key areas: physics and engineering of large accelerators; SRF technology; radiofrequency power engineering; and large-scale cryogenic systems. 

There’s a high likelihood of FRIB yielding new spin-off technologies as well as commercial applications

Many MSU ASET students supplement their courses through participation in the US Particle Accelerator School, a national programme that provides graduate-level training and workforce development in the science of particle beams and associated accelerator technologies. At a more specialist level, there’s also the MSU Cryogenic Initiative, a unique educational collaboration between the university’s college of engineering and FRIB’s cryogenics team. Meanwhile, we continue to prioritise development of a more diverse workforce, partnering with several academic institutions that traditionally serve under-represented groups to broaden participation in the FRIB programme. 

In what ways does FRIB ensure a best-practice approach to facilities management? 

Sustainability and continuous improvement underpin all FRIB working practices. We are an ISO14001-registered organisation, which means we measure ourselves against an international standard specifying requirements for effective environmental management. That’s reflected, for example, in our use of energy-efficient superconducting technologies, and also our efforts to minimise any helium wastage through an exhaustive capture, recovery and reuse scheme within FRIB’s cryogenic plant. 

We also have an ISO 9001-registered quality management system that guides how we address scientific user needs; an ISO 45001-registered occupational health and safety management system to keep our workers safe; and an ISO 27001-registered information security management system.

How important is FRIB’s relationship with industry?

Our strategic partnerships with industry are also significant in driving organisational efficiencies. The use of standard industry components wherever possible reduces maintenance and training requirements, minimises the need for expensive product inventory, and lowers our operational costs. We engage with manufacturers on a co-development basis, fast-tracking innovation and knowledge transfer so that they are able to produce core enabling technologies for FRIB at scale – whether that’s accelerator cavities, superconducting magnets, or vacuum and cryogenic subsystems.  

The post Rare isotopes aplenty at FRIB appeared first on CERN Courier.

]]>
Opinion The Facility for Rare Isotope Beams in Michigan underpins an ambitious programme to transform nuclear physics and its applications. https://cerncourier.com/wp-content/uploads/2021/09/CCUSASupp21_FRIB_linac.jpg
‘First light’ beckons as LCLS-II gears up https://cerncourier.com/a/first-light-beckons-as-lcls-ii-gears-up/ Mon, 27 Sep 2021 13:05:16 +0000 https://preview-courier.web.cern.ch/?p=94932 An ambitious upgrade of SLAC's X-ray free-electron-laser facility – the Linac Coherent Light Source – is nearing completion.

The post ‘First light’ beckons as LCLS-II gears up appeared first on CERN Courier.

]]>
LCLS-II linac tunnel and future laser

An ambitious upgrade of the US’s flagship X-ray free-electron-laser facility – the Linac Coherent Light Source (LCLS) at the SLAC National Accelerator Laboratory in California – is nearing completion. Set for “first light” in 2022, LCLS-II will deliver X-ray laser beams that are 10,000 times brighter than LCLS at repetition rates of up to a million pulses per second – generating more X-ray pulses in just a few hours than the current laser has delivered through the course of its 12-year operational lifetime. The cutting-edge physics of the new X-ray laser – underpinned by a cryogenically cooled superconducting radiofrequency (SRF) linac – will enable the two beams from LCLS and LCLS-II to work in tandem. This, in turn, will help researchers observe rare events that happen during chemical reactions and study delicate biological molecules at the atomic scale in their natural environments, as well as potentially shed light on exotic quantum phenomena with applications in next-generation quantum computing and communications systems. 
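
The headline comparison is easy to sanity-check. LCLS’s normal-conducting linac fires at 120 Hz, so even uninterrupted round-the-clock running since 2009 would have produced

```latex
N_{\rm LCLS}^{\rm max} \;\approx\; 120\ \mathrm{Hz}\times 12\ \mathrm{yr}\times 3.15\times10^{7}\ \mathrm{s\,yr^{-1}}
\;\approx\; 4.5\times10^{10}\ \text{pulses},
```

whereas LCLS-II at up to a million pulses per second delivers 3.6 × 10⁹ pulses per hour – matching that idealised total in about half a day and, since actual delivered beam time is only a fraction of calendar time, matching the realistic LCLS total within a few hours.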

Strategic commitment

Successful delivery of the LCLS-II linac was made possible by a multicentre collaborative effort involving US national and university laboratories – from the 2014 decision to pursue an SRF-based machine through the design, assembly, testing, transportation and installation of a string of 37 SRF cryomodules (most of them more than 12 m long) in the SLAC tunnel (see figures “Tunnel vision” and “Keeping cool”). All told, this non-trivial undertaking necessitated the construction of 40 1.3 GHz SRF cryomodules (five of them spares) and three 3.9 GHz cryomodules (one spare), with delivery of approximately one cryomodule per month from February 2019 until December 2020 to allow completion of the LCLS-II linac installation on schedule by November 2021. 

This industrial-scale programme of works was shaped by a strategic commitment, early on in the LCLS-II design phase, to transfer, and ultimately iterate, the established SRF capabilities of the European XFEL project into the core technology platform used for the LCLS-II SRF cryomodules. Put simply: it would not have been possible to complete the LCLS-II project, within cost and on schedule, without the sustained cooperation of the European XFEL consortium – in particular, colleagues at DESY (Germany), CEA Saclay (France) and several other European laboratories (as well as KEK in Japan) that generously shared their experiences and know-how so that the LCLS-II collaboration could hit the ground running. 

Better together 

These days, large-scale accelerator or detector projects are very much a collective endeavour. Not only is the sprawling scope of such projects beyond a single organisation, but the risks of overspend and slippage can greatly increase with a “do-it-on-your-own” strategy. When the LCLS-II project opted for an SRF technology pathway in 2014 (to maximise laser performance and future-proofing), the logical next step was to build a broad-based coalition with other US Department of Energy (DOE) national laboratories and universities. In this case, SLAC, Fermilab, Jefferson Lab (JLab) and Cornell University contributed expertise for cryomodule production, while Argonne National Laboratory and Lawrence Berkeley National Laboratory managed delivery of the undulators and photoinjector for the project. For sure, the start-up time for LCLS-II would have increased significantly without this joint effort, extending the overall project by several years.

LCLS-II cryomodule

Each partner brought something unique to the LCLS-II collaboration. While SLAC was still a relative newcomer to SRF technologies, the lab had a management team that was familiar with building large-scale accelerators (following successful delivery of the LCLS). The priority for SLAC was therefore to scale up its small nucleus of SRF experts by recruiting experienced SRF technologists and engineers to the staff team. 

In contrast, the JLab team brought an established track-record in the production of SRF cryomodules, having built its own machine, the Continuous Electron Beam Accelerator Facility (CEBAF), as well as cryomodules for the Spallation Neutron Source (SNS) linac at Oak Ridge National Laboratory in Tennessee. Cornell, too, came with a rich history in SRF R&D – capabilities that, in turn, helped to solidify the SRF cavity preparation process for LCLS-II. 

Finally, Fermilab had, at the time, recently built two cutting-edge cryomodules of the same style as that chosen for LCLS-II. To fabricate these modules, Fermilab worked closely with the team at DESY to set up the same type of production infrastructure used on the European XFEL. From that perspective, the required tooling and fixtures were all ready to go for the LCLS-II project. While Fermilab was the “designer of record” for the SRF cryomodule, with primary responsibility for delivering a working design to meet LCLS-II requirements, the realisation of an optimised technology platform was, in large part, a team effort involving SRF experts from across the collaboration.

Challenges are inevitable when developing new facilities at the limits of known technology

Operationally, the use of two facilities to produce the SRF cryomodules – Fermilab and JLab – ensured a compressed delivery schedule and increased flexibility within the LCLS-II programme. On the downside, the dual-track production model increased infrastructure costs (with the procurement of duplicate sets of tooling) and meant additional oversight to ensure a standardised approach across both sites. Ongoing procurements were divided equally between Fermilab and JLab, with deliveries often made to each lab directly from the industry suppliers. Each facility, in turn, kept its own inventory of parts, so as to minimise interruptions to cryomodule assembly owing to any supply-chain issues (and enabling critical components to be transferred between labs as required). What’s more, the close working relationship between Fermilab and JLab kept any such interruptions to a minimum.

Collective problems, collective solutions 

While the European XFEL provided the template for the LCLS-II SRF cryomodule design, several key elements of the LCLS-II approach subsequently evolved to align with the continuous-wave (CW) operation requirements and the specifics of the SLAC tunnel. Success in tackling these technical challenges – across design, assembly, testing and transportation of the cryomodules – is testament to the strength of the LCLS-II collaboration and the collective efforts of the participating teams in the US and Europe. 

SRF cryomodule

For starters, the thermal performance specification of the SRF cavities exceeded the state of the art and required development and industrialisation of the concept of nitrogen doping (a process in which SRF cavities are heat-treated in a nitrogen atmosphere to increase their cryogenic efficiency and, in turn, lower the overall operating costs of the linac). The nitrogen-doping technique was invented at Fermilab in 2012 but, prior to LCLS-II construction, had been used only in an R&D setting.

Adaptability in real time

The priority was clear: to transfer the nitrogen-doping capability to LCLS-II’s industry partners, so that the cavity manufacturers could perform the necessary materials processing before final helium-vessel jacketing. During this knowledge transfer, it was found that nitrogen-doped cavities are particularly sensitive to the base niobium sheet material – something the collaboration only realised once the cavity vendors were into full production. This resulted in a number of process changes for the heat-treatment temperature, depending on which material supplier was used and the specific properties of the niobium sheet deployed in different production runs. JLab, for its part, held the contract for the cavities and pulled out all the stops to ensure success.

At the same time, the conversion from pulsed to CW operation necessitated a faster cooldown cycle for the SRF cavities, requiring several changes to the internal piping, a larger exhaust chimney on the helium vessel, as well as the addition of two new cryogenic valves per cryomodule. Also significant is the 0.5% slope in the longitudinal floor of the existing SLAC tunnel, which dictated careful attention to liquid-helium management in the cryomodules (with a separate two-phase line and liquid-level probes at both ends of every module). 

However, the biggest setback during LCLS-II construction involved the loss of beamline vacuum during cryomodule transport. Specifically, two cryomodules had their beamlines vented and required complete disassembly and rebuilding – resulting in a five-month moratorium on shipping of completed cryomodules in the second half of 2019. It turned out that a small and seemingly inconsequential change to a coupler flange had left the cold coupler assembly susceptible to resonances excited during transport; the result was a bellows tear that vented the beamline. Unfortunately, initial “road tests” with a similar, though not exactly identical, prototype cryomodule had not revealed this behaviour. 

Shine on: from LCLS-II to LCLS-II HE

Last cryomodule

As with many accelerator projects, LCLS-II is not an end-point in itself, more an evolutionary transition within a longer term development roadmap. In fact, work is already under way on LCLS-II HE – a project that will increase the energy of the CW SRF linac from 4 to 8 GeV, enabling the photon energy range to be extended to at least 13 keV, and potentially up to 20 keV at 1 MHz repetition rates. 

To ensure continuity of production for LCLS-II HE, 25 next-generation cryomodules are in the works, with even higher performance specifications versus their LCLS-II counterparts, while upgrades to the source and beam transport are also being finalised. 

In addition to LCLS-II HE, other SRF disciplines will benefit from the R&D and technological innovation that has come out of the LCLS-II construction programme. SRF technologies are constantly evolving and advancing the state of the art, whether that’s in single-cavity cryogen-free systems, additional FEL CW upgrades to existing machines, or the building blocks that will underpin enormous new machines like the proposed International Linear Collider. 

Such challenges are inevitable when developing new facilities at the limits of known technology. In the end, the coupler-flange problem was successfully addressed by drawing on the diverse talents of the collaboration to brainstorm solutions, with the available access ports allowing an elastomer wedge to be inserted to secure the vulnerable section. A key take-away here is the need for future projects to perform thorough transport analysis, verify transport loads using mock-ups or dummy devices, and install adequate instrumentation to enable granular data analysis before long-distance transport of mission-critical components. 

Upon completion of the assembly phase, all LCLS-II cryomodules were subsequently tested at either Fermilab or JLab, with one module tested at both locations to ensure reproducibility and consistency of results. For high Q0 performance in nitrogen-doped cavities, cooldown flow rates of at least 30 g/s of liquid helium were found to give the best results, helping to expel magnetic flux that could otherwise be trapped in the cavity. 

Overall, cryomodule performance on the test stands exceeded specifications, with an average energy gain per cryomodule of 158 MV (versus specification of 128 MV) and average Q0 of 3 × 10¹⁰ (versus specification of 2.7 × 10¹⁰). Looking ahead, attention is already shifting to the real-world cryomodule performance in the SLAC tunnel – something that will be measured for the first time in 2022.
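
Those test-stand figures translate into comfortable energy headroom. A minimal sketch, using only numbers quoted in this article and assuming, as a simplification, that all 35 installed 1.3 GHz cryomodules contribute the average gain (the two installed 3.9 GHz modules are ignored):

```python
# Energy headroom implied by the test-stand results quoted above.
# Simplification: all 35 installed 1.3 GHz cryomodules (40 built minus
# five spares) deliver the average gain; 3.9 GHz modules are ignored.

n_modules = 35
gain_spec_mv = 128       # MV per cryomodule, specification
gain_meas_mv = 158       # MV per cryomodule, test-stand average

print(f"At specification: {n_modules * gain_spec_mv / 1000:.2f} GeV")  # ~4.5 GeV
print(f"As measured:      {n_modules * gain_meas_mv / 1000:.2f} GeV")  # ~5.5 GeV
```

Against the linac’s 4 GeV design energy, the measured gains suggest a margin of well over 1 GeV on the test stands.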

Transferable lessons

For all members of the collaboration working on the LCLS-II cryomodules, this challenging project holds many lessons. Most important is the nature of collaboration itself, building a strong team and using that strength to address problems in real time as they arise. The mantra “we are all in this together” should be front-and-centre for any multi-institutional scientific endeavour – as it was in this case. With all parties making their best efforts, the goal should be to utilise the combined strengths of the collaboration to mitigate challenges. Solutions need to be thought of in a more global sense, since the best answer might mean another collaborator taking more onto their plate. Collaboration implies true partnership and a working model very different to a transactional customer–vendor relationship.

Collaboration implies true partnership and a working model very different to a transactional relationship

From a planning perspective, it’s vital to ensure that the initial project cost and schedule are consistent with the technical challenges and preparedness of the infrastructure. Prototypes and pre-series production runs reduce risk and cost in the long term and should be part of the plan, but there must be sufficient time for data analysis and changes to be made after a prototype run in order for it to be useful. Time spent on detailed technical reviews is also time well spent. New designs of complex components need detailed oversight and review, and should be controlled by a team, rather than a single individual, so that sign-off on any detailed design change is made by an informed collective. 

Planning ahead

Work planning and control is another essential element for success and safety. This idea needs to be built into the “manufacturing system”, including into the cost and schedule, and be part of each individual’s daily checklist. No one disagrees with this concept, but good intentions on their own will not suffice. As such, required safety documentation should be clear and unambiguous, and be reviewed by people with relevant expertise. Production data and documentation need to be collected, made easily available to the entire project team, and analysed regularly for trends, both positive and negative. 

JLab cryomodule

Supply chain, of course, is critical in any production environment – and LCLS-II is no exception. When possible, it is best to have parts procured, inspected, accepted and on-the-shelf before production begins, thereby eliminating possible workflow delays. Pre-stocking also allows adequate time to recycle and replace parts that do not meet project specifications. Also worth noting is that it’s often the smaller components – such as bellows, feedthroughs and copper-plated elements – that drive workflow slowdowns. A key insight from LCLS-II is to place purchase orders early, stay on top of vendor deliveries, and perform parts inspections as soon as possible post-delivery. Projects also benefit from having clearly articulated pass/fail criteria and established procedures for handling non-conformance – all of which alleviates the need to make critical go/no-go acceptance decisions in the face of schedule pressures.

Finally, it’s worth highlighting the broader impact – both personal and professional – on the individual team members participating in a big-science collaboration like LCLS-II. At the end of the build, what remained after designs were completed, problems solved, production rates met, and cryomodules delivered and installed, were the friendships that had been nurtured over several years. The collaboration among partners, formal and informal, who truly cared about the project’s success and had each other’s backs when issues arose: these are the things that solidified the mutual respect and camaraderie and, in the end, made LCLS-II such a rewarding project.

The post ‘First light’ beckons as LCLS-II gears up appeared first on CERN Courier.

]]>
Feature An ambitious upgrade of SLAC's X-ray free-electron-laser facility – the Linac Coherent Light Source – is nearing completion. https://cerncourier.com/wp-content/uploads/2021/09/CCUSASupp21_LCLS_cool.jpg
BASE demonstrates two-trap cooling https://cerncourier.com/a/base-demonstrates-two-trap-cooling/ Wed, 25 Aug 2021 14:58:58 +0000 https://preview-courier.web.cern.ch/?p=93967 As reported today in Nature, the technique promises to reduce the time needed to cool antiprotons from hours to seconds.

The post BASE demonstrates two-trap cooling appeared first on CERN Courier.

]]>
In a significant technological advance for antimatter research, the BASE (Baryon Antibaryon Symmetry Experiment) collaboration has used laser-cooled ions to cool a proton more quickly and to lower temperatures than is possible using existing methods. The new technique, which introduces a separate Penning trap, promises to reduce the time needed to cool protons and antiprotons to sub-Kelvin temperatures from hours to seconds, potentially increasing the sample sizes available for precision matter-antimatter comparisons by orders of magnitude. As reported today in Nature, the collaboration’s test setup at the University of Mainz also reached temperatures approximately 10 times lower than the limit of the established resistive-cooling technique.

“The factor 10 reduction in temperature which has been achieved in our paper is just a first step,” says BASE deputy spokesperson Christian Smorra of the University of Mainz and RIKEN. “With optimised procedures we should be able to reach particle temperatures of order 20 mK to 50 mK, ideally in cooling times of order 10 seconds. Previous methods allowed us to reach 100 mK in 10 hours.”

The new setup consists of two Penning traps separated by 9 cm. One trap contains a single proton. The other contains a cloud of beryllium ions that are laser-cooled using conventional techniques. The proton is cooled as its kinetic energy is transferred through a superconducting resonant electric circuit into the cooler beryllium trap.

Two-trap sympathetic cooling

The proton and the beryllium ions can be thought of as mechanical oscillators within the magnetic and electric fields of the Penning traps, explains lead author Matthew Bohman of the Max Planck Institute for Nuclear Physics in Heidelberg and RIKEN. “The resonant electric circuit acts like a spring, coupling the oscillations — the oscillation of the proton is damped by its coupling to the conventionally cooled cloud of beryllium ions.”
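
This energy exchange can be made concrete with a toy model: two resonant oscillators coupled by a weak spring-like term (standing in for the superconducting circuit), with damping applied only to the “beryllium” side. All parameters below are invented, arbitrary-unit values for illustration, not those of the BASE apparatus:

```python
# Toy model of sympathetic cooling: the "proton" oscillator loses energy
# through a weak coupling to a damped "beryllium" oscillator.
# Parameters are illustrative, in arbitrary units.
import numpy as np
from scipy.integrate import solve_ivp

omega = 1.0    # common resonance frequency
kappa = 0.02   # weak coupling strength (the resonant-circuit "spring")
gamma = 0.05   # damping of the laser-cooled oscillator

def rhs(t, y):
    x1, v1, x2, v2 = y
    a1 = -omega**2 * x1 + kappa * (x2 - x1)                 # proton
    a2 = -omega**2 * x2 + kappa * (x1 - x2) - gamma * v2    # beryllium ions
    return [v1, a1, v2, a2]

# Start with all of the energy in the proton-like oscillator.
sol = solve_ivp(rhs, (0.0, 400.0), [1.0, 0.0, 0.0, 0.0], max_step=0.05)

energy = 0.5 * sol.y[1]**2 + 0.5 * omega**2 * sol.y[0]**2
print(f"Proton-mode energy: {energy[0]:.2f} -> {energy[-1]:.5f}")
```

Running the model shows the proton-like oscillator’s energy draining away through the coupling, which is the essence of sympathetic cooling.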

Two-trap sympathetic cooling was first proposed in 1990 by Daniel Heinzen and David Wineland. Wineland went on to share the 2012 Nobel prize in physics for related work in manipulating individual particles while preserving quantum information. The use of a resonant electric circuit to couple the two Penning traps is an innovation by the BASE collaboration that speeds up the rate of energy exchange, from minutes in Heinzen and Wineland’s proposal to seconds. The technique is useful for protons, but game-changing for antiprotons.

Antiproton prospects

A two-trap setup is attractive for antimatter because a single Penning trap cannot easily accommodate particles with opposite charges, and laser-cooled ions are nearly always positively charged, with electrons stripped away. BASE previously cooled antiprotons by coupling them to a superconducting resonator at around 4 K, and painstakingly selecting the lowest energy antiprotons in the ensemble over many hours. 

Our technique shows that you can apply the laser-physics toolkit to exotic particles

Matthew Bohman

“With two-trap sympathetic cooling by laser-cooled beryllium ions, the limiting temperature rapidly approaches that of the ions, in the milli-Kelvin range,” explains Bohman. “Our technique shows that you can apply the laser-physics toolkit to exotic particles like antiprotons: a good antiproton trap looks pretty different from a good laser-cooled ion trap, but if you’re able to connect them by a wire or a coil you can get the best of both worlds.”

The BASE collaboration has already measured the magnetic moment of the antiproton with a record fractional precision of 1.5 parts per billion at CERN’s antimatter factory. When deployed there, two-trap sympathetic cooling has the potential to improve the precision of the measurement by at least a factor of 20. Any statistically significant difference relative to the magnetic moment of the proton would violate CPT symmetry and signal a dramatic break with the Standard Model.

“Our vision is to continuously improve the precision of our matter-antimatter comparisons to develop a better understanding of the cosmological matter-antimatter asymmetry,” says BASE spokesperson Stefan Ulmer of RIKEN. “The newly developed technique will become a key method in these experiments, which aim at measurements of fundamental antimatter constants at the sub-parts-per-trillion level. Further developments in progress at the BASE-logic experiment in Hanover will even allow the implementation of quantum-logic metrology methods to read out the antiproton’s spin state.”

The post BASE demonstrates two-trap cooling appeared first on CERN Courier.

]]>
News As reported today in Nature, the technique promises to reduce the time needed to cool antiprotons from hours to seconds. https://cerncourier.com/wp-content/uploads/2021/08/BASE-two-trap-cooling.jpg
Particle Detectors – Fundamentals and Applications https://cerncourier.com/a/particle-detectors-fundamentals-and-applications/ Sat, 10 Jul 2021 09:12:43 +0000 https://preview-courier.web.cern.ch/?p=92981 Kolanoski and Wermes' new book is a reference for lectures on experimental methods for postgraduate students, writes our reviewer.

The post Particle Detectors – Fundamentals and Applications appeared first on CERN Courier.

]]>
Particle Detectors – Fundamentals and Applications

Throughout the history of nuclear, particle and astroparticle physics, novel detector concepts have paved the way to new insights and new particles, and will continue to do so in the future. To help train the next generation of innovators, noted experimental particle physicists Hermann Kolanoski (Humboldt University Berlin and DESY) and Norbert Wermes (University of Bonn) have written a comprehensive textbook on particle detectors. The authors use their broad experience in collider and underground particle-physics experiments, astroparticle physics experiments and medical-imaging applications to confidently cover the spectrum of experimental methods in impressive detail.

Particle Detectors – Fundamentals and Applications combines in a single volume the syllabus also found in two well-known textbooks covering slightly different aspects of detectors: Techniques for Nuclear and Particle Physics Experiments by W R Leo and Detectors for Particle Radiation by Konrad Kleinknecht. Kolanoski and Wermes’ book supersedes them both by being more up-to-date and comprehensive. It is more detailed than Particle Detectors by Claus Grupen and Boris Shwartz – another excellent and recently published textbook with a similar scope – and will probably attract a slightly more advanced population of physics students and researchers. This new text promises to become a particle-physics analogue of the legendary experimental-nuclear-physics textbook Radiation Detection and Measurement by Glenn Knoll.

The book begins with a comprehensive warm-up chapter on the interaction of charged particles and photons with matter, going well beyond a typical textbook level. This is followed by a very interesting discussion of the transport of charge carriers in media in magnetic and electric fields, and – a welcome novelty – signal formation, using the method of “weighting fields”. The main body of the book is devoted first to gaseous, semiconductor, Cherenkov and transition-radiation detectors, and then to detector systems for tracking, particle identification and calorimetry, and the detection of cosmic rays, neutrinos and exotic matter. Final chapters on electronics readout, triggering and data acquisition complete the picture. 

Particle Detectors – Fundamentals and Applications is best considered a reference for lectures on experimental methods in particle and nuclear physics for postgraduate-level students. The book is easy to read, and conceptual discussions are well supported by numerous examples, plots and illustrations of excellent quality. Kolanoski and Wermes have undoubtedly written a gem of a book, with value for any experimental particle physicist, be they a master’s student, PhD student or accomplished researcher looking for detector details outside of their expertise.

The post Particle Detectors – Fundamentals and Applications appeared first on CERN Courier.

]]>
Review Kolanoski and Wermes' new book is a reference for lectures on experimental methods for postgraduate students, writes our reviewer. https://cerncourier.com/wp-content/uploads/2021/06/CCJulAug21_REV_Particle_feature.jpg
‘A CERN for climate change’ https://cerncourier.com/a/a-cern-for-climate-change/ Fri, 02 Jul 2021 07:44:24 +0000 https://preview-courier.web.cern.ch/?p=92890 An exascale computing facility modelled on the organisation of CERN would enable a step-change in quantifying climate change, argue Tim Palmer and Bjorn Stevens. 

The post ‘A CERN for climate change’ appeared first on CERN Courier.

]]>
Climate models

In the early 1950s, particle accelerators were national-level activities. It soon became obvious that to advance the field further demanded machines beyond the capabilities of single countries. CERN marked a phase transition in this respect, enabling physicists to cooperate around the development of one big facility. Climate science stands to similarly benefit from a change in its topology.

Modern climate models were developed in the 1960s, but there weren’t any clear applications or policy objectives at that time. Today we need hard numbers about how the climate is changing, and an ability to seamlessly link these changes to applications – a planetary information system for assessing hazards, planning food security, aiding global commerce, guiding infrastructural investments, and much more. National centres for climate modelling exist in many countries. But we need a centre “on steroids”: a dedicated exascale computing facility organised on a similar basis to CERN that would allow the necessary leap in realism.

Quantifying climate

To be computationally manageable, existing climate models solve equations for quantities that are first aggregated over large spatial and temporal scales. This blurs their relationship to physical laws, to phenomena we can measure, and to the impacts of a changing climate on infrastructure. Clouds, for example, are creatures of circulation, particularly vertical air currents. Existing models attempt to infer what these air currents would be given information about much larger scale 2D motion fields. There is a necessary degree of abstraction, which leads to less useful results. We don’t know if air is going up or down an individual mountain, for instance, because we don’t have individual mountains in the model, at best mountain ranges. 

Tim Palmer

In addition to more physical models, we also need a much better quantification of model uncertainty. At present this is estimated by comparing solutions across many low-resolution models, or by perturbing parameters of a given low-resolution model. The particle-physics analogy might be that everyone runs their own low-energy accelerators hoping that coordinated experiments will provide high-energy insights. Concentrating efforts on a few high-resolution climate models, where uncertainty is encoded through stochastic mathematics, is a high-energy effort. It would result in better and more useful models, and open the door to cooperative efforts to systematically explore the structural stability of the climate system and its implications for future climate projections.

Working out climate science’s version of the Standard Model thus provides the intellectual underpinnings for a “CERN for climate change”. One can and should argue about the exact form such a centre should take, whether it be a single facility or a federation of campuses, and about the relative weight it gives to particular questions. What is important is that it creates a framework for European climate, computer and computational scientists to cooperate, also with application communities, in ways that deliver the maximum benefit for society.

Building momentum

A number of us have been arguing for such a facility for more than a decade. The idea seems to be catching on, less for the eloquence of our arguments, more for the promise of exascale computing. A facility to accelerate climate research in developing and developed countries alike has emerged as a core element of one of 12 briefing documents prepared by the Royal Society in advance of the United Nations Climate Change Conference, COP26, in November. This briefing flanks the European Union’s “Destination Earth” project, which is part of its Green Deal programme – a €1 billion effort over 10 years that envisions the development of improved high-resolution models with better quantified uncertainty. If not anchored in a sustainable organisational concept, however, this risks throwing money to the wind.

Bjorn Stevens

Giving a concrete form to such a facility still faces internal hurdles, possibly similar to those faced by CERN in its early days. For example, there are concerns that it will take away funding from existing centres. We believe, and CERN’s own experience shows, that the opposite is more likely true. A “CERN for climate change” would advance the frontiers of the science, freeing researchers to turn their attention to new questions, rather than maintaining old models, and provide an engine for European innovation that extends far beyond climate change.

The post ‘A CERN for climate change’ appeared first on CERN Courier.

]]>
Opinion An exascale computing facility modelled on the organisation of CERN would enable a step-change in quantifying climate change, argue Tim Palmer and Bjorn Stevens.  https://cerncourier.com/wp-content/uploads/2021/06/CCJulAug21_VIEW_blanc.jpg
CERN’s impact on medical technology https://cerncourier.com/a/cerns-impact-on-medical-technology/ Thu, 24 Jun 2021 12:33:04 +0000 https://preview-courier.web.cern.ch/?p=92874 Frontier instruments like the LHC and its detectors not only push back the boundaries of our knowledge, but also catalyse innovative technology for medical applications, writes Manuela Cirilli.

The post CERN’s impact on medical technology appeared first on CERN Courier.

]]>
Hadron-therapy beam

Today, the tools of experimental particle physics are ubiquitous in hospitals and biomedical research. Particle beams damage cancer cells; high-performance computing infrastructures accelerate drug discoveries; computer simulations of how particles interact with matter are used to model the effects of radiation on biological tissues; and a diverse range of particle-physics-inspired detectors, from wire chambers to scintillating crystals to pixel detectors, all find new vocations imaging the human body.

CERN has actively pursued medical applications of its technologies as far back as the 1970s. At that time, knowledge transfer happened – mostly serendipitously – through the initiative of individual researchers. An eminent example is Georges Charpak, a detector physicist of outstanding creativity who invented the Nobel-prize-winning multiwire proportional chamber (MWPC) at CERN in 1968. The MWPC’s ability to record millions of particle tracks per second opened a new era for particle physics (CERN Courier December 1992 p1). But Charpak strived to ensure that the technology could also be used outside the field – for example in medical imaging, where its sensitivity promised to reduce radiation doses during imaging procedures – and in 1989 he founded a company that developed an imaging technology for radiography which is currently deployed as an orthopaedic application. Following his example, CERN has continued to build a culture of entrepreneurship ever since.

Triangulating tumours

Since as far back as the 1950s, a stand-out application for particle-physics detector technology has been positron-emission tomography (PET) – a “functional” technique that images changes in the metabolic process rather than anatomy. The patient is injected with a compound carrying a positron-emitting isotope, which accumulates in areas of the body with high metabolic activity (the uptake of glucose, for example, could be used to identify a malignant tumour). Pairs of back-to-back 511 keV photons are detected when a positron annihilates with an electron in the surrounding matter, allowing the tumour to be triangulated.

Colour X-ray of a mouse

Pioneering developments in PET instrumentation took place in the 1970s. While most scanners were based on scintillating crystals, the work done with wire chambers at the University of California at Berkeley inspired CERN physicists David Townsend and Alan Jeavons to use high-density avalanche chambers (HIDACs) – Charpak’s detector plus a photon-conversion layer. In 1977, with the participation of CERN radiobiologist Marilena Streit-Bianchi, this technology was used to create some of the first PET images, most famously of a mouse. The HIDAC detector later contributed significantly to 3D PET image reconstruction, while a prototype partial-ring tomograph developed at CERN was a forerunner for combined PET and computed tomography (CT) scanners. Townsend went on to work at the Cantonal Hospital in Geneva and then in the US, where his group helped develop the first PET/CT scanner, which combines functional and anatomic imaging.

Crystal clear

In the onion-like configuration of a collider detector, an electromagnetic calorimeter often surrounds a descendant of Charpak’s wire chambers, causing photons and electrons to cascade and measuring their energy. In 1991, to tackle the challenges posed by future detectors at the LHC, the Crystal Clear collaboration was formed to study innovative scintillating crystals suitable for electromagnetic calorimetry. Since its early years, Crystal Clear also sought to apply the technology to other fields, including healthcare. Several breast, pancreas, prostate and animal-dedicated PET scanner prototypes have since been developed, and the collaboration continues to push the limits of coincidence-time resolution for time-of-flight (TOF) PET. 

In TOF–PET, the difference between the arrival times of the two back-to-back photons is recorded, allowing the location of the annihilation along the axis connecting the detection points to be pinned down. Better time resolution therefore improves image quality and reduces the acquisition time and radiation dose to the patient. Crystal Clear continues this work to this day through the development of innovative scintillating-detector concepts, including at a state-of-the-art laboratory at CERN.
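
The geometry behind this is simple: the annihilation point sits a distance Δx = cΔt/2 from the midpoint of the line between the two detection points, where Δt is the measured time difference. A short illustrative calculation (the resolution values are generic examples, not Crystal Clear results):

```python
# Localisation along the line of response in TOF-PET: delta_x = c * delta_t / 2.

c = 3.0e8  # speed of light in m/s

for ctr_ps in (500, 200, 20):  # example coincidence time resolutions
    delta_x_mm = c * (ctr_ps * 1e-12) / 2 * 1000
    print(f"{ctr_ps:4d} ps -> {delta_x_mm:5.1f} mm localisation")
```

A 200 ps coincidence time resolution thus localises the annihilation to about 3 cm, and every further gain in timing tightens the constraint proportionally.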

The dual aims of the collaboration have led to cross-fertilisation, whereby the work done for high-energy physics spills over to medical imaging, and vice versa. For example, the avalanche photodiodes developed for the CMS electromagnetic calorimeter were adapted for the ClearPEM breast-imaging prototype, and technology developed for detecting pancreatic and prostate cancer (EndoTOFPET-US) inspired the “barrel timing layer” of crystals that will instrument the central portion of the CMS detector during LHC Run 3.

Pixel perfect

In the same 30-year period, the family of Medipix and Timepix read-out chips has arguably made an even bigger impact on med-tech and other application fields, becoming one of CERN’s most successful technology-transfer cases. Developed with the support of four successive Medipix collaborations, involving a total of 37 research institutes, the technology is inspired by the high-resolution hybrid pixel detectors initially developed to address the challenges of particle tracking in the innermost layers of the LHC experiments. In hybrid detectors, the sensor array and the read-out chip are manufactured independently and later coupled by a bump-bonding process. This means that a variety of sensors can be connected to the Medipix and Timepix chips, according to the needs of the end user.

Visualisation of energy deposition

The first Medipix chip produced in the 1990s by the Medipix1 collaboration was based on the front-end architecture of the Omega3 chip used by the half-million-pixel tracker of the WA97 experiment, which studied strangeness production in lead–ion collisions. The upgraded Medipix1 chip also included a counter per pixel. This demonstrated that the chips could work like a digital camera, providing high-resolution, high-contrast and noise-hit-free images, making them uniquely suitable for medical applications. Medipix2 improved spatial resolution and produced a modified version called Timepix that offers time or amplitude measurements in addition to hit counting. Medipix3 and Timepix3 then allowed the energy of each individual photon to be measured – Medipix3 allocates incoming hits to energy bins in each pixel, providing colour X-ray images, while Timepix3 times hits with a precision of 1.6 ns, and sends the full hit data – coordinate, amplitude and time – off chip. Most recently, the Medipix4 collaboration, which was launched in 2016, is designing chips that can seamlessly cover large areas, and is developing new read-out architectures, thanks to the possibility of tiling the chips on all four sides.
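
The per-pixel energy binning behind “colour” X-ray imaging is conceptually simple: each pixel keeps a separate counter for each energy window. A schematic sketch (bin edges and hits are invented, and this is illustrative pseudologic, not the Medipix3 read-out itself):

```python
# Schematic per-pixel photon counting in energy bins, the principle behind
# spectral ("colour") X-ray imaging. All values are invented for illustration.
import numpy as np

bin_edges_kev = [10, 30, 50, 80]              # three hypothetical energy bins
counts = np.zeros((256, 256, 3), dtype=int)   # pixel matrix x energy bins

def record_hit(row, col, energy_kev):
    """Increment the counter of the bin that the photon energy falls into."""
    b = np.searchsorted(bin_edges_kev, energy_kev) - 1
    if 0 <= b < counts.shape[2]:
        counts[row, col, b] += 1

record_hit(120, 40, 22.0)   # falls in the 10-30 keV bin
record_hit(120, 40, 64.5)   # falls in the 50-80 keV bin
print(counts[120, 40])      # -> [1 0 1]
```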

Medipix and Timepix chips find applications in widely varied fields, from medical imaging to cultural heritage, space dosimetry, materials analysis and education. The industrial partners and licence holders commercialising the technology range from established enterprises to start-up companies. In the medical field, the technology has been applied to X-ray CT prototype systems for digital mammography, CT imagers for mammography, and beta- and gamma-autoradiography of biological samples. In 2018 the first 3D colour X-ray images of human extremities were taken by a scanner developed by MARS Bioimaging Ltd, using the Medipix3 technology. By analysing the spectrum recorded in each pixel, the scanner can distinguish multiple materials in a single scan, opening up a new dimension in medical X-ray imaging: with this chip, images are no longer black and white, but in colour (see “Colour X-ray” image).

Although the primary aim of the Timepix3 chip was applications outside of particle physics, its development also led directly to new solutions in high-energy physics, such as the VELOpix chip for the ongoing LHCb upgrade, which permits data-driven trigger-free operation for the first time in a pixel vertex detector in a high-rate experiment. 

Dosimetry

CERN teams are also exploring the potential uses of Medipix technology in dosimetry. In 2019, for example, Timepix3 was employed to determine the exposure of medical personnel to ionising radiation in an interventional radiology theatre at Christchurch Hospital in New Zealand. The chip was able to map the radiation fluence and energy spectrum of the scattered photon field that reaches the practitioners, and can also provide information about which parts of the body are most exposed to radiation.

Meanwhile, “GEMPix” detectors are being evaluated for use in quality assurance in hadron therapy. GEMPix couples gas electron multipliers (GEMs) – a type of gaseous ionisation detector developed at CERN – with the Medipix integrated circuit as readout to provide a hybrid device capable of detecting all types of radiation with a high spatial resolution. Following initial results from tests on a carbon-ion beam performed at the National Centre for Oncological Hadrontherapy (CNAO) in Pavia, Italy, a large-area GEMPix detector with an innovative optical read-out is now being developed at CERN in collaboration with the Holst Centre in the Netherlands. A version of the GEMPix called GEMTEQ is also currently under development at CERN for use in “microdosimetry”, which studies the temporal and spatial distributions of absorbed energy in biological matter to improve the safety and effectiveness of cancer treatments.

Knowledge transfer at CERN

GEMPix detectors

As a publicly funded laboratory, CERN has a remit, in addition to its core mission to perform fundamental research in particle physics, to expand the opportunities for its technology and expertise to deliver tangible benefits to society. The CERN Knowledge Transfer group strives to maximise the impact of CERN technologies and know-how on society in many ways, including through the establishment of partnerships with clinical, industrial and academic actors, support to budding entrepreneurs and seed funding to CERN personnel.

Supporting the knowledge-transfer process from particle physics to medical research and the med-tech industry is a promising avenue to boost healthcare innovation and provide solutions to present and future health challenges. CERN has provided a framework for the application of its technologies to the medical domain through a dedicated strategy document approved by its Council in June 2017. CERN will continue its efforts to maximise the impact of the laboratory’s know-how and technologies on the medical sector.

Two further dosimetry applications illustrate how technologies developed for CERN’s needs have expanded into commercial medical applications. The B-RAD, a hand-held radiation survey meter designed to operate in strong magnetic fields, was developed by CERN in collaboration with the Polytechnic of Milan and is now available off the shelf from an Italian company. Originally conceived for radiation surveys around the LHC experiments and inside ATLAS with the magnetic field on, it has found applications in several other tasks, such as radiation measurements on permanent magnets, radiation surveys at PET-MRI scanners and at MRI-guided radiation therapy linacs. Meanwhile, the radon dose monitor (RaDoM) tackles exposure to radon, a natural radioactive gas that is the second leading cause of lung cancer after smoking. The RaDoM device directly estimates the dose by reproducing the energy deposition inside the lung instead of deriving the dose from a measurement of radon concentration in air; CERN also developed a cloud-based service to collect and analyse the data, to control the measurements and to drive mitigation measures based on real-time data. The technology is licensed to the CERN spin-off BAQ. 

Cancer treatments

Having surveyed the medical applications of particle detectors, we turn to the technology driving the beams themselves. Radiotherapy is a mainstay of cancer treatment, using ionising radiation to damage the DNA of cancer cells. In most cases, a particle accelerator is used to generate a therapeutic beam. Conventional radiation therapy uses X-rays generated by a linac, and is widely available at relatively low cost.

Medipix and Timepix read-out chips have become one of CERN’s most successful technology-transfer cases

Radiotherapy with protons was first proposed by Fermilab’s founding director Robert Wilson in 1946 while he was at Berkeley, and interest in the use of heavier ions such as carbon arose soon after. While X-rays lose energy roughly exponentially as they penetrate tissue, protons and other ions deposit almost all of their energy in a sharp “Bragg” peak at the very end of their path, enabling the dose to be delivered on the tumour target, while sparing the surrounding healthy tissues. Carbon ions have the additional advantage of a higher radiobiological effectiveness, and can control tumours that are radio-resistant to X-rays and protons. Widespread adoption of hadron therapy is, however, limited by the cost and complexity of the required infrastructures, and by the need for more pre-clinical and clinical studies.
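
The contrast between the two depth-dose profiles can be sketched with textbook approximations: near-exponential attenuation for X-rays and, for protons, the Bragg-Kleeman range-energy rule R = αE^p. The model below is deliberately crude and only illustrative; the coefficients are approximate literature values for water:

```python
# Crude depth-dose sketch: exponential attenuation for X-rays versus a
# Bragg peak for protons (Bragg-Kleeman rule, approximate water values).
import numpy as np

alpha, p = 0.0022, 1.77            # cm/MeV^p and exponent, illustrative
E0 = 150.0                         # proton energy in MeV
R = alpha * E0**p                  # range in water, ~16 cm

z = np.linspace(0.0, 20.0, 400)    # depth in cm

xray_dose = np.exp(-0.05 * z)      # illustrative attenuation coefficient

# Proton dose ~ stopping power, which rises as (R - z)**(1/p - 1)
# towards the end of range: the Bragg peak.
residual = np.clip(R - z, 1e-3, None)
proton_dose = np.where(z < R, residual**(1.0 / p - 1.0), 0.0)
proton_dose /= proton_dose[0]      # normalise to the entrance dose

print(f"Range of {E0:.0f} MeV protons: ~{R:.1f} cm")
print(f"Peak-to-entrance dose ratio in this toy model: {proton_dose.max():.0f}")
```

Even this toy model reproduces the qualitative point: the X-ray dose is highest near the skin and decays steadily, while the proton dose stays modest until it spikes near the end of range and then drops to zero.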

PIMMS and NIMMS

Between 1996 and 2000, under the impetus of Ugo Amaldi, Meinhard Regler and Phil Bryant, CERN hosted the Proton-Ion Medical Machine Study (PIMMS). PIMMS produced and made publicly available an optimised design for a cancer-therapy synchrotron capable of using both protons and carbon ions. After further enhancement by Amaldi’s TERA foundation, and with seminal contributions from Italian research organisation INFN, the PIMMS concept evolved into the accelerator at the heart of the CNAO hadron therapy centre in Pavia. The MedAustron centre in Wiener Neustadt, Austria, was then based on the CNAO design. CERN continues to collaborate with CNAO and MedAustron by sharing its expertise in accelerator and magnet technologies. 

In the 2010s, CERN teams put to use the experience gained in the construction of Linac 4, which became the source of proton beams for the LHC in 2020, and developed an extremely compact high-frequency radio-frequency quadrupole (RFQ) to be used as injector for a new generation of high-frequency, compact linear accelerators for proton therapy. The RFQ accelerates the proton beam to 5 MeV after only 2 m, and operates at 750 MHz – almost double the frequency of conventional RFQs. A major advantage of using linacs for proton therapy is the possibility of changing the energy of the beam, and hence the depth of treatment in the body, from pulse to pulse by switching off some of the accelerating units. The RFQ technology was licensed to the CERN spin-off ADAM, now part of AVO (Advanced Oncotherapy), and is being used as an injector for a breakthrough linear proton therapy machine at the company’s UK assembly and testing centre at STFC’s Daresbury Laboratory. 
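
The compactness claim is easy to quantify from the figures given here (a simple average; the real field profile along the structure is of course not uniform):

```python
# Back-of-envelope figures for the 750 MHz RFQ, from the numbers in the text.

energy_mev, length_m, freq_hz = 5.0, 2.0, 750e6
c = 3.0e8  # speed of light in m/s

print(f"Average energy gain: {energy_mev / length_m:.1f} MV/m")   # 2.5 MV/m
print(f"RF wavelength:       {c / freq_hz * 100:.0f} cm")         # 40 cm
```

Since cavity dimensions scale with the RF wavelength, roughly doubling the operating frequency is what allows such a compact structure.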

Simulation of a dendritic arbour

In 2019 CERN launched the Next Ion Medical Machine Study (NIMMS) to develop cutting-edge accelerator technologies for a new generation of compact and cost-effective ion-therapy facilities. The goal is to propel the use of ion therapy, given that proton installations are already commercially available and that only four ion centres exist in Europe, all based on bespoke solutions. 

NIMMS is organised along four different lines of activity. The first aims to reduce the footprint of facilities by developing new superconducting magnet designs with large apertures and curvatures, and suitable for pulsed operation. The second is the design of a compact linear accelerator optimised for installation in hospitals, which includes an RFQ based on the design of the proton therapy RFQ, and a novel source for fully stripped carbon ions. The third concerns two innovative gantry designs, with the aim of reducing the size, weight and complexity of the massive magnetic structures that allow the beam to reach the patient from different angles: the SIGRUM lightweight rotational gantry originally proposed by TERA, and the GaToroid gantry invented at CERN which eliminates the need to mechanically rotate the structure by using a toroidal magnet (see figure “GaToroid”). Finally, new high-current synchrotron designs will be developed to reduce the cost and footprint of facilities while reducing the treatment time compared to present European ion-therapy centres: these will include a superconducting and a room-temperature option, and advanced features such as multi-turn injection for 10¹⁰ particles per pulse, fast and slow extraction, and multiple ion operation. Through NIMMS, CERN is contributing to the efforts of a flourishing European community, and a number of collaborations have already been established. 

Another recent example of frontier radiotherapy techniques is the collaboration with Switzerland’s Lausanne University Hospital (CHUV) to build a new cancer therapy facility that would deliver high doses of radiation from very-high-energy electrons (VHEE) in milliseconds instead of minutes. The goal here is to exploit the so-called FLASH effect, wherein radiation doses administered over short time periods appear to damage tumours more than healthy tissue, potentially minimising harmful side-effects. This pioneering installation will be based on the high-gradient accelerator technology developed for the proposed CLIC electron–positron collider. Various research teams have been performing their biomedical research related to VHEE and FLASH at the CERN Linear Electron Accelerator for Research (CLEAR), one of the few facilities available for characterising VHEE beams.

Radioisotopes

CERN’s accelerator technology is also deployed in a completely different way to produce innovative radioisotopes for medical research. In nuclear medicine, radioisotopes are used both for internal radiotherapy and for diagnosis of cancer and other diseases, and progress has always been connected to the availability of novel radioisotopes. Here, CERN has capitalised on the experience of its ISOLDE facility, which during the past 30 years has used the proton beam from the CERN PS Booster to produce 1300 different isotopes from 73 chemical elements for research ranging from nuclear physics to the life sciences. A new facility, called ISOLDE-MEDICIS, is entirely dedicated to the production of unconventional radioisotopes with the right properties to enhance the precision of both patient imaging and treatment. In operation since late 2017, MEDICIS will expand the range of radioisotopes available for medical research – some of which can be produced only at CERN – and send them to partner hospitals and research centres for further studies. During its 2019 and 2020 harvesting campaigns, for example, MEDICIS demonstrated the capability of purifying isotopes such as ¹⁶⁹Er or ¹⁵³Sm to new purity grades, making them suitable for innovative treatments such as targeted radioimmunotherapy.

Data handling and simulations

The expertise of particle physicists in data handling and simulation tools are also increasingly finding applications in the biomedical field. The FLUKA and Geant4 simulation toolkits, for example, are being used in several applications, from detector modelling to treatment planning. Recently, CERN contributed its know-how in large-scale computing to the BioDynaMo collaboration, initiated by CERN openlab together with Newcastle University, which initially aimed to provide a standardised, high-performance and open-source platform to support complex biological simulations (see figure “Computational neuroscience”). By hiding its computational complexity, BioDynaMo allows researchers to easily create, run and visualise 3D agent-based simulations. It is already used by academia and industry to simulate cancer growth, accelerate drug discoveries and simulate how the SARS-CoV-2 virus spreads through the population, among other applications, and is now being extended beyond biological simulations to visualise the collective behaviour of groups in society. 
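
BioDynaMo itself is a C++ platform; purely to give a flavour of what “agent-based” means here, the following toy has each cell act as an independent agent that grows and divides (all names, rules and parameters are hypothetical and do not reflect BioDynaMo’s actual API):

```python
# Minimal agent-based growth toy: each cell grows every step and divides
# above a volume threshold, spawning a daughter nearby. Illustrative only.
import random

class Cell:
    def __init__(self, x, y, volume=1.0):
        self.x, self.y, self.volume = x, y, volume

    def step(self, population):
        self.volume *= 1.1                       # simple growth rule
        if self.volume > 2.0:                    # division threshold
            self.volume /= 2.0
            dx, dy = random.uniform(-1, 1), random.uniform(-1, 1)
            population.append(Cell(self.x + dx, self.y + dy, self.volume))

population = [Cell(0.0, 0.0)]
for _ in range(30):
    for cell in list(population):                # copy: the list grows in place
        cell.step(population)

print(f"Cells after 30 steps: {len(population)}")
```

The appeal of platforms like BioDynaMo is performing exactly this kind of update, but for very large numbers of agents, efficiently on modern hardware.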

The expertise of particle physicists in data handling and simulation tools are increasingly finding applications in the biomedical field

Many more projects related to medical applications are in their initial phases. The breadth of knowledge and skills available at CERN was also evident during the COVID-19 pandemic when the laboratory contributed to the efforts of the particle-physics community in fields ranging from innovative ventilators to masks and shields, from data management tools to open-data repositories, and from a platform to model the concentration of viruses in enclosed spaces to epidemiologic studies and proximity-sensing devices, such as those developed by Terabee.

Fundamental research has a priceless goal: knowledge for the sake of knowledge. The theories of relativity and quantum mechanics were considered abstract and esoteric when they were developed; a century later, we owe to them the remarkable precision of GPS systems and the transistors that are the foundation of the electronics-based world we live in. Particle-physics research acts as a trailblazer for disruptive technologies in the fields of accelerators, detectors and computing. Even though their impact is often difficult to track as it is indirect and diffused over time, these technologies have already greatly contributed to the advances of modern medicine and will continue to do so.

The post CERN’s impact on medical technology appeared first on CERN Courier.

]]>
Feature Frontier instruments like the LHC and its detectors not only push back the boundaries of our knowledge, but also catalyse innovative technology for medical applications, writes Manuela Cirilli. https://cerncourier.com/wp-content/uploads/2021/06/CCJulAug21_MEDTECH_feature.jpg
From CERN to the environment https://cerncourier.com/a/from-cern-to-the-environment/ Fri, 04 Jun 2021 11:09:21 +0000 https://preview-courier.web.cern.ch/?p=92522 A recent CERN Alumni Network event highlighted how skills developed in high-energy physics can be transferred to careers in the environmental industry.

The post From CERN to the environment appeared first on CERN Courier.

]]>
Daphne Technology

CERN technologies and personnel make it a hub for so much more than exploring the fundamental laws of the universe. In an event organised by the CERN Alumni Relations team on 30 April, five CERN alumni who now work in the environmental industry discussed how their high-energy physics training helped them to get to where they are today.

One panellist, Zofia Rudjor, used to work on the ATLAS trigger system and the measurement of Higgs-boson decays to tau leptons. Having spent 10 years at CERN, and with the discovery of the Higgs still fresh in the memory, she now works as a data scientist for the Norwegian Institute for Water Research (NIVA). “For my current role, a lot of the skills that I acquired at CERN, from solving complex problems to working with real-time data streams, turned out to be very key and useful,” she said at the virtual April event. Similar sentiments were shared by fellow panellist Manel Sanmarti, a former cryogenic engineer who is now the co-founder of Bamboo Energy Platform: “CERN is kind of the backbone of my career – it’s really excellent. I would say it’s the ‘Champions League’ of technology!”

However, much learning and preparation is also required to transition from particle physics to the environment. Charlie Cook began his career as an engineer at CERN and is now the founder of Rightcharge, a company which helps electric car drivers reduce the cost of charging and to use cleaner energy sources. Before taking the plunge into the environmental industry, he first completed a course at Imperial College Business School on climate-change management and finance, which helped him “learn the lingo” in the finance world. A stint at Octopus Electric Vehicles was followed by driving a domestic vehicle-to-grid demonstration project called Powerloop which launched at the beginning of 2018. “Sometimes it’s too easy to start talking in abstract terms about sustainability, but, to really understand things I like to see the numbers behind everything,” he said.

Everything that is happening in the environmental field today is all because of policymakers

Mario Michan, CEO of Daphne Technology (a company focused on enabling industries to decarbonise), and a former investigator of antihydrogen at CERN’s Antiproton Decelerator, also stressed the importance of being familiar with how the sector works, pointing out the large role that policymakers take in the field: “Everything that is happening in the environmental field today is all because of policymakers,” he remarked.

Another particle physicist who made the change is Giorgio Cortiana, who now works in E.ON’s global advanced analytics and artificial intelligence team, leading several data-science projects. His scientific background in complex physics data analysis, statistics, machine learning and object-oriented programming is ideal for extracting meaningful insights from large datasets, and for coping with everyday problems that need quick and effective solutions, he explained, noting the different mentality from academia. “At CERN you have the luxury to really focus on your research, down to the tiny details — now, I have to be a bit more pragmatic,” he said. “Here [at E.ON] we are instead looking to try and make an impact as soon as we can.”

Leaving the field

The decision to leave the familiar surroundings of high-energy physics requires perseverance, stressed Rudjor, stating that it is important to pick up the phone to find out what type of position is really being offered. Other panellists also noted that it is vital to spend some time looking at what skills you can bring to a specific posting. “I think there are many workplaces which don’t really know how to recruit people with our skills – they would like the people, but they typically don’t open positions because they don’t know exactly how to specify the job.”

The CERN Alumni Network’s “Moving Out of Academia” events provide a rich source of candid advice for those seeking to make the change, while also demonstrating the impact of high-energy physics in broader society. The latest environment-industry events follow others dedicated to careers in finance, industrial engineering, big data, entrepreneurship and medical technologies. More are in store, explains head of CERN Alumni Relations, Rachel Bray. “One of our goals is to support those in their early careers – if and when they decide to leave academia for another sector. In addition to the Moving out of Academia events, we have recently launched a new series which brings together early-career scientists and the companies seeking the talents and skills developed at CERN.”

The post From CERN to the environment appeared first on CERN Courier.

]]>
Careers A recent CERN Alumni Network event highlighted how skills developed in high-energy physics can be transferred to careers in the environmental industry. https://cerncourier.com/wp-content/uploads/2021/06/Daphne_Technology_crp-2.jpg
The CERN Quantum Technology Initiative https://cerncourier.com/a/the-cern-quantum-technology-initiative/ Thu, 11 Mar 2021 13:21:55 +0000 https://preview-courier.web.cern.ch/?p=91842 This webinar will introduce the new CERN Quantum Technology Initiative, give an overview of the laboratory’s R&D activities and plans in this field, and give examples of the potential impact on research.

The post The CERN Quantum Technology Initiative appeared first on CERN Courier.

]]>
Quantum technologies have the potential to revolutionise science and society, but are still in their infancy. In recent years, the growing importance and the potential impact of quantum technology development has been highlighted by increasing investments in R&D worldwide in both academia and industry.

Cutting-edge research in quantum systems has been performed at CERN for many years to investigate the many open questions in quantum mechanics and particle physics. Only recently, however, have the different ongoing activities in quantum computing, sensing, communications and theory been brought under a common strategy to assess their potential impact on future CERN experiments.

This webinar, presented by Alberto Di Meglio, will introduce the new CERN Quantum Technology Initiative, give an overview of the Laboratory’s R&D activities and plans in this field, and give examples of the potential impact on research. It will also touch upon the rich international network of activities and how CERN fosters research collaborations.

Alberto Di Meglio is the head of CERN openlab in the IT Department at CERN and co-ordinator of the CERN Quantum Technology Initiative. Alberto is an aerospace engineer (MEng) and electronic engineer (PhD) by education and has extensive experience in the design, development and deployment of distributed computing and data infrastructures and software services for both commercial and research applications.

He joined CERN in 1998 as a data centre systems engineer. In 2004, he took part in the early stages of development of the High-Energy Physics Computing Grid. From 2010 to 2013, Alberto was project director of the European Middleware Initiative (EMI), a project responsible for developing and maintaining most of the software services powering the Worldwide LHC Computing Grid.

Since 2013, Alberto has been leading CERN openlab, a long-term initiative to organise public–private collaborative R&D projects between CERN, academia and industry in ICT, computer and data science, covering many aspects of today’s technology, from heterogeneous architectures and distributed computing to AI and quantum technologies.

The post The CERN Quantum Technology Initiative appeared first on CERN Courier.

]]>
Webinar This webinar will introduce the new CERN Quantum Technology Initiative, give an overview of the laboratory’s R&D activities and plans in this field, and give examples of the potential impact on research. https://cerncourier.com/wp-content/uploads/2021/03/2021-03-31-webinar-image.jpg
Iodine aerosol production could accelerate Arctic melting https://cerncourier.com/a/iodine-aerosol-production-could-accelerate-arctic-melting/ Thu, 04 Mar 2021 07:54:51 +0000 https://preview-courier.web.cern.ch/?p=91361 Studies at CERN’s CLOUD experiment reveal that part-per-trillion-by-volume iodine levels in marine regions lead to rapid formation of iodic acid particles

The post Iodine aerosol production could accelerate Arctic melting appeared first on CERN Courier.

]]>
Sea ice

Researchers at CERN’s CLOUD experiment have uncovered a new mechanism that could accelerate the loss of Arctic sea ice. In a paper published in Science on 5 February, the team showed that aerosol particles made of iodic acid can form extremely rapidly in the marine boundary layer – the portion of the atmosphere that is in direct contact with the ocean. Aerosol particles are important for the climate because they provide the seeds on which cloud droplets form. Marine new-particle formation is especially important since particle concentrations are low and the ocean is vast. However, how new aerosol particles form, and how they influence clouds and climate, remains relatively poorly understood.

“Our measurements are the first to show that the part-per-trillion-by-volume iodine levels found in marine regions will lead to rapid formation and growth of iodic acid particles,” says CLOUD spokesperson Jasper Kirkby of CERN, adding that the particle formation rate is also strongly enhanced by ions from galactic cosmic rays. “Although most atmospheric particles form from sulphuric acid, our study shows that iodic acid – which is produced by the action of sunlight and ozone on molecular iodine emitted by the sea surface, sea ice and exposed seaweed – may be the main driver in pristine marine regions.”

CLOUD is a one-of-a-kind experiment that uses an ultraclean cloud chamber to measure the formation and growth of aerosol particles from a mixture of vapours under precisely controlled atmospheric conditions, including the use of a high-energy beam from the Proton Synchrotron to simulate cosmic rays up to the top of the troposphere. Last year, the team found that small inhomogeneities in the concentrations of ammonia and nitric acid can have a major role in driving winter smog episodes in cities. The latest result is similarly important but in a completely different area, says Kirkby.

“In polar regions, aerosols and clouds have a warming effect because they absorb infrared radiation otherwise lost to space and then radiate it back down to the surface, whereas they reflect no more incoming sunlight than the snow-covered surface. As more sea surface is exposed by melting ice, the increased iodic acid aerosol and cloud-seed formation could provide a previously unaccounted positive feedback that accelerates the loss of sea ice. However, the effect has not yet been modelled so we can’t quantify it yet.”

The post Iodine aerosol production could accelerate Arctic melting appeared first on CERN Courier.

]]>
News Studies at CERN’s CLOUD experiment reveal that part-per-trillion-by-volume iodine levels in marine regions lead to rapid formation of iodic acid particles https://cerncourier.com/wp-content/uploads/2021/02/CCMarApr21_NA_Arctic.jpg
Learning language by machine https://cerncourier.com/a/learning-language-by-machine/ Fri, 05 Feb 2021 08:14:06 +0000 https://preview-courier.web.cern.ch/?p=86628 Mait Müntel left physics to found Lingvist, an education company harnessing big data and artificial intelligence to accelerate language learning.

The post Learning language by machine appeared first on CERN Courier.

]]>
Lingvist CEO Mait Müntel talks to Rachel Bray

Mait Müntel came to CERN as a summer student in 2004 and quickly became hooked on particle physics, completing a PhD in the CMS collaboration in 2008 with a thesis devoted to signatures of doubly charged Higgs bosons. Continuing in the field, he was one of the first to do shifts in the CMS control room when the LHC ramped up. It was then that he realised that the real LHC data looked nothing like the Monte Carlo simulations of his student days. Many things had to be rectified, but Mait admits he was none too fond of coding and didn’t have any formal training. “I thought I would simply ‘learn by doing’,” he says. “However, with hindsight, I should probably have been more systematic in my approach.” Little did he know that, within a few years, he would be running a company with around 40 staff developing advanced language-learning algorithms.

Memory models

Despite spending long periods in the Geneva region, Mait had not found the time to pick up French. Frustrated, he began to take an interest in the use of computers to help humans learn languages at an accelerated speed. “I wanted to analyse from a statistical point of view the language people were actually speaking, which, having spent several years learning both Russian and English, I was convinced was very different to what is found in academic books and courses,” he says. Over the course of one weekend, he wrote a software crawler that enabled him to download a collection of French subtitles from a film database. His next step was to study memory models to understand how one acquires new knowledge, calculating that, if a computer program could intelligently decide what would be optimal to learn in the next moment, it would be possible to learn a language in only 200 hours. He started building some software using ROOT (the object-oriented program and library developed by CERN for data analysis) and, within two weeks, was able to read a proper book in French. “I had included a huge book library in the software and as the computer knew my level of vocabulary, it could recommend books for me. This was immensely gratifying and pushed me to progress even further.” Two months later, he passed the national French language exam in Estonia.
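
The scheduling idea Mait describes can be sketched in a few lines of code. The toy model below is not Lingvist’s proprietary algorithm: it assumes a simple exponential forgetting curve with illustrative parameter values, and always shows the word whose predicted recall is weakest.

import math

# Toy spaced-repetition scheduler in the spirit described above.
# NOT Lingvist's algorithm: the exponential forgetting curve and all
# parameter values here are illustrative assumptions only.

class Card:
    def __init__(self, word):
        self.word = word
        self.strength = 1.0      # memory "half-life" in hours (assumed)
        self.last_seen = 0.0     # time of last review, in hours

    def recall_probability(self, now):
        # Exponential forgetting: p = exp(-dt / strength)
        return math.exp(-(now - self.last_seen) / self.strength)

    def review(self, now, remembered):
        # Successful recall consolidates the memory; failure resets it.
        self.strength = self.strength * 2.5 if remembered else 1.0
        self.last_seen = now

def next_card(cards, now):
    # Show the card the learner is most likely to be about to forget.
    return min(cards, key=lambda c: c.recall_probability(now))

cards = [Card(w) for w in ["bonjour", "merci", "fenêtre"]]
for now in range(12):                       # one simulated review per hour
    card = next_card(cards, float(now))
    remembered = card.recall_probability(float(now)) > 0.4  # stand-in for the user's answer
    card.review(float(now), remembered)
    print(f"t={now}h review {card.word!r} (remembered={remembered})")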

Mait became convinced that he had to do something with his idea. So he went on holiday, and hired two software developers to develop his code so it would work on the web. Whilst on holiday, he happened to meet a friend of a friend, who helped him set up Lingvist as a company. Estonia, he says, has a fantastic start-up and software-development culture thanks to Skype, which was invented there. Later, Mait met the technical co-founder of Skype at a conference, who coincidentally had been working on software to accelerate human learning; he dropped his own attempts and became Lingvist’s first investor.

The pair secured a generous grant from the European Union Horizon 2020 programme and things were falling into place, though it wasn’t all easy says Mait: “You can use the analogy of sitting in a nice warm office at CERN, surrounded by beautiful mountains. In the office, you are safe and protected, but if you go outside and climb the mountains, you encounter rain and hail, it is an uphill struggle and very uncomfortable, but immensely satisfying when you reach the summit. Even if you work more than 100 hours per week.”

Lingvist currently has three million users, and Mait is convinced that the technology can be applied to all types of education. “What our data have demonstrated is that levels of learning in people are very different. Short-term memory capabilities can differ between five minutes and two seconds! Currently, based on our data, the older generation has much better memory characteristics. The benefit of our software is that it measures memory, and no matter one’s retention capabilities, the software will help improve retention rates.”

New talents

Faced with a future where artificial intelligence will make many jobs extinct, and many people will need to retrain, competitiveness will be derived from the speed at which people can learn, says Mait. He is now building Lingvist’s data-science research team to grow the company to its full potential, and is always on the lookout for new CERN talent. “Traditionally, physicists have excellent modelling, machine-learning and data-analysis skills, even though they might not be aware of it,” he says.

The post Learning language by machine appeared first on CERN Courier.

]]>
Careers Mait Müntel left physics to found Lingvist, an education company harnessing big data and artificial intelligence to accelerate language learning. https://cerncourier.com/wp-content/uploads/2020/02/CCMarApr_Careers.jpg
From CERN technologies to medical applications https://cerncourier.com/a/from-cern-technologies-to-medical-applications/ Wed, 03 Feb 2021 09:39:51 +0000 https://preview-courier.web.cern.ch/?p=91008 Watch this webinar now, sponsored by Hiden Analytical and Instrumentation Technologies.

The post From CERN technologies to medical applications appeared first on CERN Courier.

]]>
Besides the intrinsic worth of the knowledge that it generates, particle physics often acts as a trailblazer in developing cutting-edge technologies in the fields of accelerators, detectors and computing. These technologies, and the human expertise associated with them, find applications in a variety of areas, including the biomedical field, and can have a societal impact going way beyond their initial scope and expectations.

This webinar will introduce the knowledge-transfer goals of CERN, give an overview of the Laboratory’s medical-applications-related activities and give examples of the impact of CERN technologies on medtech: from hadrontherapy to medical imaging, flash radiotherapy, computing and simulation tools. It will also touch upon the challenges of transferring the technologies and know-how from CERN to the medtech industry and medical research.

Dr Manuela Cirilli is the deputy group leader of CERN’s Knowledge Transfer (KT) group, whose mission is to maximise the impact of CERN on society by creating opportunities for the transfer of the Laboratory’s technologies and know-how to fields outside particle physics. Manuela leads the Medical Applications section of the KT group and chairs the CERN Medical Applications Project Forum. She has an academic background in particle physics and science communication. In 1997, she started working on the NA48 experiment at CERN, designed to measure CP violation in the kaon system. In 2001, she began working on the construction, commissioning and calibration of the precision muon chambers of the ATLAS experiment at the LHC, until she joined CERN’s KT group in 2010.

In parallel to her career, Manuela has been actively engaging in science communication and popularisation since the early 2000s.

The post From CERN technologies to medical applications appeared first on CERN Courier.

]]>
Webinar Watch this webinar now, sponsored by Hiden Analytical and Instrumentation Technologies. https://cerncourier.com/wp-content/uploads/2021/01/2021-02-25-webinar.jpg
HEP-based ventilator to be adapted for clinical use https://cerncourier.com/a/hep-based-ventilator-to-be-adapted-for-clinical-use/ Wed, 27 Jan 2021 08:40:49 +0000 https://preview-courier.web.cern.ch/?p=90797 A UK team is adapting a low-cost ventilator developed at CERN for clinical use in the fight against COVID-19.

The post HEP-based ventilator to be adapted for clinical use appeared first on CERN Courier.

]]>
High Energy physics Ventilator

A versatile ventilator to help combat COVID-19 developed by members of the LHCb collaboration is to be re-engineered for manufacture and clinical use. The High Performance Low-cost Ventilator (HPLV) is designed to assist patients in low- and middle-income countries suffering from severe respiratory problems as a result of COVID-19. Following the award of £760,000 by UK Research and Innovation, announced in December, Ian Lazarus of the Science and Technology Facilities Council’s Daresbury Laboratory and co-workers aim to produce and test plans for the creation of an affordable, reliable and easy-to-operate ventilator that does not rely so heavily on compressed gases and mains electricity supply.

“I am proud to be leading the HPLV team in which we have brought together experts from medicine, science, engineering and knowledge transfer with a shared goal to make resilient high-quality ventilators available in areas of the world that currently don’t have enough of them,” said Lazarus in a press release. 

While the majority of people who contract COVID-19 suffer mild symptoms, in some cases the disease can cause severe breathing difficulties and pneumonia. For such patients, the availability of ventilators that deliver oxygen to the lungs while removing carbon dioxide is critical. Commercially available ventilators are typically costly, require considerable experience to use, and often rely on the provision of high-flow oxygen and medically pure compressed air, which are not readily available in many countries.

The HPLV takes as its starting point the High Energy physics Ventilator (HEV), which was inspired by an initiative at the University of Liverpool and developed at CERN in March 2020 during the first COVID-19 lockdown. The idea emerged when physicists and engineers in LHCb’s vertex locator (VELO) group realised that the systems which are routinely used to supply and control gas at desired temperatures and pressures in particle-physics detectors are well matched to the techniques required to build and operate a ventilator (CERN Courier May/June 2020 p8). HPLV will see the hardware and software of HEV adapted to make it ready for regulatory approval and manufacture. Project partners at the Federal Institute of Rio de Janeiro in Brazil – in collaboration with CERN, the University of Birmingham, the University of Liverpool and the UK’s Medical Devices Testing and Evaluation Centre – will now identify difficulties encountered when ventilating patients and pass that information to the design team to ensure that the HPLV is fit for purpose.

“We warmly welcome the HPLV initiative, and look forward to working together with the outstanding HPLV team for our common humanitarian goal,” says Paula Collins, who co-leads the HEV project with CERN and LHCb colleague Jan Buytaert. The HPLV is one of several HEV offshoots involving 25 academic partners, she explains. “In December we also saw the first HEV prototypes to be constructed outside CERN, at the Swiss company Jean Gallay SA, which specialises in engineering for aerospace and energy. We have continued our outreach worldwide, and in particular wish to highlight an agreement being built up with a company in India that plans to modify the HEV design for local needs. None of this would have been possible without the incredible support and advice received from the medical community.”

The post HEP-based ventilator to be adapted for clinical use appeared first on CERN Courier.

]]>
News A UK team is adapting a low-cost ventilator developed at CERN for clinical use in the fight against COVID-19. https://cerncourier.com/wp-content/uploads/2021/01/CCJanFeb21_NA_ventilator.jpg
European projects boost CERN’s medical applications https://cerncourier.com/a/european-projects-boost-cerns-medical-applications/ Tue, 26 Jan 2021 09:49:39 +0000 https://preview-courier.web.cern.ch/?p=90794 CERN's Next Ion Medical Machine Study (NIMMS) and MEDICIS facility work towards next- generation medical treatments.

The post European projects boost CERN’s medical applications appeared first on CERN Courier.

]]>
The MedAustron proton/carbon-ion synchrotron

A CERN-based effort to bring about the next generation of hadron-therapy facilities has obtained new funding from the European Commission (EC) to pursue technology R&D. CERN’s Next Ion Medical Machine Study (NIMMS) aims to drive a new European effort for ion-beam therapy based on smaller, cheaper accelerators that allow faster treatments, operation with multiple ions, and patient irradiation from different angles using a compact gantry system. Its predecessor, the Proton-Ion Medical Machine Study (PIMMS), which was undertaken at CERN during the late 1990s, underpinned the CNAO (Italy) and MedAustron (Austria) treatment centres that helped propel Europe to the forefront of hadron therapy.

Covering the period 2021–2024, two recently approved EC Horizon 2020 Research Infrastructure projects will support NIMMS while also connecting its activities to collaborating institutes throughout Europe. The multidisciplinary HITRIplus project (Heavy Ion Therapy Research Integration) includes work packages dedicated to accelerator, gantry and superconducting magnet design. The IFAST project (Innovation Fostering in Accelerator Science and Technology) will include activities on prototyping superconducting magnets for ion therapy with industry, together with many other actions related to advanced accelerator R&D.

“Over the past three years we have collected about €4 million of EC contributions, directed to a collaboration of more than 15 partners, representing about a factor of eight leverage on the original CERN funding,” says NIMMS project leader Maurizio Vretenar. “A key achievement was the simultaneous approval of HITRIplus and IFAST because they contain three strong work packages built around the NIMMS work-plan and associate our work with a wide collaboration of institutes.”

A major NIMMS partner is the new South East European International Institute for Sustainable Technologies (SEEIIST), an initiative started by former CERN Director-General Herwig Schopper and former minister of science for Montenegro Sanja Damjanovic, which aims to build a pan-European facility for cancer research and therapy with ions in South East Europe. CNAO and MedAustron are closely involved in the superconducting gantry design, CIEMAT in Spain will build a high-frequency linac section, and INFN is developing new superconducting magnets, with the TERA Foundation continuing to underpin medical-accelerator R&D.

MEDICIS success

Also successful in securing new Horizon 2020 funding is a project built around CERN’s MEDICIS facility, which is devoted to the production of novel radioisotopes for medical research together with institutes in life and medical sciences. The PRISMAP project (the European medical isotope programme) will bring together key facilities in the provision of high-purity-grade new radionuclides to advance early-phase research into radiopharmaceuticals, targeted drugs for cancer, “theranostics” and personalised medicine in Europe.

A successful programme towards this goal was developed by MEDICIS during the past two years, with partner institutes providing sources that were purified on a MEDICIS beamline using mass separation, explains Thierry Stora of CERN. “Our programme was particularly impressive this year, with record separation efficiencies of more than 50% met for 167Tm, the first medical isotope produced at CERN 40 years ago with somewhat lower efficiencies,” he says. “It also allowed the translation of 153Sm, already used in low specific activity grades for palliative treatments, to R&D for new therapeutic applications.” MEDICIS is now concluding its programme with the separation of 225Ac, a fast-emerging radionuclide for the rising field of targeted alpha therapy. “Isotope mass separation at MEDICIS acted as a catalyst for the creation of the European medical isotope programme,” says Stora, who leads the MEDICIS facility.

Together with other project consortia, the MEDICIS and HITRIplus teams are also working to identify the relevance of their research for the EC’s future cancer mission, which is part of its next framework programme, Horizon Europe, beginning this year.

Two further EC Horizon 2020 projects launched by CERN – AIDAinnova, which will enable collaboration on common detector projects, and RADNEXT, which will provide a network of irradiation facilities to test state-of-the-art microelectronics – were approved in November. “These results demonstrate CERN’s outstanding success rate in research-infrastructure projects,” says Svet Stavrev, head of CERN’s EU projects management and operational support section. “Since the beginning of the programme, Horizon 2020 has provided valuable support to major projects, studies and initiatives for accelerator and detector R&D in the particle-physics community.”

The post European projects boost CERN’s medical applications appeared first on CERN Courier.

]]>
News CERN's Next Ion Medical Machine Study (NIMMS) and MEDICIS facility work towards next- generation medical treatments. https://cerncourier.com/wp-content/uploads/2019/08/MedAustron2-1.jpg
Hyperloop: think big, win big https://cerncourier.com/a/hyperloop-think-big-win-big/ Wed, 06 Jan 2021 09:27:18 +0000 https://preview-courier.web.cern.ch/?p=90518 Leybold’s Tom Kammermeier on the German manufacturer’s long-range bet on hyperloop vacuum-based transportation systems.

The post Hyperloop: think big, win big appeared first on CERN Courier.

]]>
Tom Kammermeier

Tom Kammermeier is an industrial physicist in a hurry. Hardly surprising given that the commercial roadmap he’s following points to a multibillion-dollar opportunity for vacuum equipment makers – an opportunity that, in turn, promises to transform ground-based mass-transportation of people and goods over the coming decades using energy-efficient hyperloop technologies.

Put simply: if technology hype translates into commercial reality, today’s proof-of-principle hyperloop test facilities will, ultimately, scale up to enable the transit of passenger and freight capsules from A to B through steel tubes (roughly 4 m in diameter) maintained at partial vacuum (typically less than 1 mbar). The end-game: journeys of several hundred kilometres at speeds in excess of 1000 km/h – Los Angeles to San Francisco, Mumbai to Chennai, Montreal to Toronto are just some of the high-demand routes on the drawing board – with maglev technologies teed up to provide the required propulsion, acceleration and deceleration along the way.

While the journey to commercial hyperloop deployment is only just beginning, a thriving and diverse innovation ecosystem is already hard at work, with heavily financed technology start-ups and dozens of academic groups and established manufacturers coalescing into a nascent hyperloop supply chain. As Leybold’s global application development manager (industrial vacuum), Kammermeier is front-and-centre in the German manufacturer’s efforts to establish itself as the “go-to” vacuum technology partner for the hyperloop development community. Here he talks to CERN Courier about the trade-offs, challenges and near-term benefits of playing the long game on technology strategy.

How does your application development team support technical and commercial outcomes within Leybold?

I coordinate a team of 20 application specialists worldwide who handle what we call third-level product support – essentially any unique or non-standard technical requests that get referred to us by our regional sales and field engineering colleagues. In each case, we’ll work closely with Leybold’s product engineering and R&D teams to come up with solutions, ensuring that any new learning and insights are shared across the organisation through a structured programme of knowledge dissemination – online webinars, tutorial videos and the like. Our remit also includes the investigation and development of new vacuum applications. This work is informed by emerging customer needs in markets where Leybold already has an established presence – for example, surface coatings, semiconductors, solar technology and food and drink – as well as evaluation of longer-range commercial applications like hyperloop transportation.

What’s the back-story to Leybold’s engagement with the hyperloop community?

The hyperloop opportunity was initially championed at Leybold back in 2015 by my colleague Carl Brockmeyer, who at the time was head of new business development (and is now president of Leybold’s scientific vacuum division). While Carl articulated the long-term commercial vision, my team focused on initial simulations and high-level requirements-gathering for the enabling vacuum technologies. At the outset, we worked closely with pioneering development companies such as Hyperloop Transportation Technologies (HTT) in the US and Virgin Hyperloop (US), while subsequent collaborations include TransPod (Canada) and the EuroTube Foundation (Switzerland).

I’m a physicist by training and, from the off, it was evident to me that there are no insurmountable technical barriers to hyperloop transportation. As such, it seems clear that the large-scale deployment of hyperloop systems will ultimately be driven by policy-makers and by commercial factors such as capital/operational costs versus return on investment.

Hyperloop represents a long-term commercial opportunity for Leybold. Are there any near-term upsides?

The calculus is simple: in the absence of volume orders, we invest time and resources in early-stage R&D collaborations with leading hyperloop companies in return for the publicity, benefits of association and the acquisition of technical and commercial domain knowledge. The work is bursty and comes in waves – essentially an R&D programme and reciprocal learning exercise at this stage. More widely, we’re seeing some payback in our established market sectors, where the hyperloop activity has opened doors with new customers who might not know Leybold so well. What we see is that hyperloop is a great topic for our sales teams to talk about – it’s very relatable.

What do these hyperloop collaborations typically involve?

Our approach is project-led, bringing together ad hoc teams of engineering, simulation and application specialists to address a range of customer requirements. Most of our collaborations to date have kicked off with simulation studies – a relatively cheap way to test the water and build a fundamental understanding of hyperloop vacuum systems and their core technologies.

Virgin Hyperloop’s DevLoop test facility

It wasn’t long, however, before our systems group began supplying one-off hardware orders, including a large-scale vacuum pumping unit for Virgin Hyperloop’s DevLoop test facility in the Nevada desert. While this is a custom installation, it’s based on existing commercial pumping units that we sell into steel degassing applications, though with several modifications to the programmable controller.

There’s been lots of hype about hyperloop over the last five years. How do you see the market trajectory right now?

My take is that hyperloop R&D and commercialisation activities are gathering pace, as evidenced by the first successful demonstration of human travel in a hyperloop pod at Virgin Hyperloop’s DevLoop test site back in October. This represents a significant breakthrough after more than 400 previously unoccupied test-runs at DevLoop. Elsewhere, we recently sold another big pumping system into HTT for its work-in-progress test-track near Toulouse, France. We’re frequently in contact with them regarding simulation or engineering considerations, with safety-critical aspects very much to the fore as HTT also plans to transport human passengers in the near future. 

What sort of technical challenges is Leybold being asked to address by hyperloop developers?

Pumping down a hyperloop vacuum tube over hundreds of kilometres is a non-trivial engineering challenge. From a vacuum perspective, you need to think carefully about the spacing of your pumping stations along the tube; optimisation of each pumping system; what happens in case of tube failures or accidents; and how the distributed pumping network can provide back-up pumping capacity and compensation (see “Hyperloop: rewriting the rules of large-scale vacuum”).

Hyperloop: rewriting the rules of large-scale vacuum

Custom pumping systems

“Pumping down a hyperloop vacuum tube over hundreds of kilometres is a non-trivial engineering challenge,” notes Leybold’s Tom Kammermeier in our accompanying interview. Here he outlines some of the key design and engineering considerations for hyperloop vacuum systems.

Location, location, location

The aspect ratio (diameter/length) of a hyperloop system is enormous – 1:1,000,000 is easily within reach – and imposes inescapable design constraints in terms of vacuum pumping capability. A single-site pumping station, while minimising capital outlay, would result in some odd pressure distributions and gradients along the hyperloop track. During pump-down, for example, the operator might register the target base pressure at one end of the pipe while the other end is still at atmospheric pressure. What’s needed instead is an intelligent distribution of pumping capacity along the track – crucial for compensation of any leaks and pump failures, and doubly so in terms of reducing capital/operational expenditure (as every additional pumping site means more outlay in terms of enclosures, power supply, water supply and associated infrastructure).

Smart strategies for leak management

A vacuum system can be defined along a number of coordinates, not least in terms of its pump-down requirements and target operating pressure (where the total pumping speed equals the inleak flow rate). The higher the permissible operating pressure, the lower the pumping speed, and the greater the aggregate energy savings over time. A large-scale hyperloop system will therefore require a smart pumping network to optimise the distribution of pumping speed dynamically versus local inleak flow conditions – a capability that, in turn, will yield significant (and recurring) operational savings. It’s also worth noting that an understanding of the pumping-speed distribution (essentially a granular map of pressure along the tube) will enable efficient leak detection without recourse to a conventional and time-consuming leak search.
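
As a rough illustration of that balance, and of why pump-station spacing matters, the sketch below evaluates the textbook molecular-flow result for a tube with a uniformly distributed gas load: the pressure at each station settles where pumping speed matches the in-leak, and the profile between stations is parabolic. All numbers are hypothetical, not Leybold design figures.

# Back-of-envelope model of a distributed pumping network, assuming
# molecular flow and a uniformly distributed in-leak. All values are
# hypothetical illustrations, not Leybold design figures.

q = 1e-4    # gas load per unit length (mbar·l/s per metre), assumed
L = 2000.0  # pump-station spacing (m), assumed
S = 5000.0  # effective pumping speed per station (l/s), assumed
w = 1e7     # specific conductance of the tube (l·m/s), assumed

# Each station removes the gas load of one spacing length, so the
# pressure at the pump settles where pumping speed balances in-leak:
p_pump = q * L / S

# Between stations the pressure profile is parabolic; the textbook
# maximum, midway between pumps, adds a conductance-limited term:
p_max = p_pump + q * L**2 / (8 * w)

print(f"pressure at pump station: {p_pump:.1e} mbar")
print(f"peak pressure mid-span:   {p_max:.1e} mbar")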

Gearing up, pumping down

Peak energy consumption for any hyperloop vacuum system will occur during end-to-end pump-down along the track. With this in mind, Leybold is working to optimise its multistage Roots pumping systems for the very long pump-downs (of the order of 12–24 hours) that will be required in large-scale hyperloop tubes. Roots pumps are an excellent option for high-volume flows at low pressures – i.e. the usual operating regime of hyperloop systems – but their efficient use for an extended pump-down from atmospheric pressure is problematic. Issues can include overheating due to gas compression; overload of the motor; or exceeding temperature limits due to low heat dissipation at low gas pressures. The answer is to employ variable-speed drives, which basically “know” the thermodynamics of each individual pump and enable optimised use. In this way, the programmable logic controller of the pumping system is able to orchestrate the individual pumps to yield the highest possible pumping speed during a pump-down – equating to some millions of m³/h for a 1000 km track.
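
Those pump-down figures can be sanity-checked with the standard constant-speed estimate below. It ignores outgassing and the pressure dependence of real pumping speed, so it is a lower-bound sketch under stated assumptions rather than a design calculation.

import math

# First-order pump-down estimate for a hyperloop-scale tube, ignoring
# outgassing and the pressure dependence of pumping speed (assumptions).

diameter = 4.0        # tube diameter in metres (typical figure cited above)
length = 1_000_000.0  # 1000 km track, in metres
p0, p1 = 1013.0, 1.0  # from atmosphere to ~1 mbar operating pressure

volume = math.pi * (diameter / 2) ** 2 * length   # tube volume in m³
t = 24 * 3600.0                                   # target: 24 h, in seconds

# Ideal pump-down: p(t) = p0 * exp(-S*t/V)  =>  S = (V/t) * ln(p0/p1)
speed = volume / t * math.log(p0 / p1)            # net pumping speed, m³/s

print(f"tube volume: {volume:.2e} m³")
print(f"required net speed: {speed:.0f} m³/s ≈ {speed * 3600:.1e} m³/h")
# ~1000 m³/s, i.e. several million m³/h, consistent with the text above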

What lessons have you learned from Leybold’s engagement with the hyperloop community?

A lot of the learning here has been around the simulation of large-scale distributed vacuum systems – because no-one has ever built a vacuum system on the scale necessary to support commercial hyperloop transportation. We’ve had plenty of discussions to date regarding our models and whether they’re still valid over distances of several hundred kilometres, while our technology roadmap focuses on what an optimised pumping system will look like for future “live” hyperloop deployments. To date, because the market is still not mature enough, we’ve created smart hyperloop pumping systems by adapting our existing product lines – specifically, units that we’ve developed for steel-industry applications.

Is cost a big driver of your hyperloop R&D priorities?

Always. Cost-of-ownership calculations feature prominently in discussions with all our hyperloop customers. We’ve given a lot of input, for example, on required pumping speed versus leak flow rate versus operating pressure. Fundamental studies like this help our partners to evaluate whether it’s worth focusing more of their investments on a leak-tight pipe or on the vacuum pumping systems. Another priority for developers is energy consumption, so our system-level simulations provide vital insights for the accurate calculation of pump-down time and vacuum performance versus energy budget. In this context, it’s worth noting that Leybold’s DRYVAC Energy Saver – which reduces the energy consumption of our dry compressing screw pumps and systems by as much as 50% – is emerging as a potential game-changer for the large-scale pumping systems that will underpin hyperloop installations.

Are vacuum equipment makers ready if hyperloop’s technology push translates into market pull?

If hyperloop transportation really takes off, it will represent a massive growth market for the vacuum industry. Even a mid-size hyperloop project will require significant focus and scale-up from suppliers like Leybold. The biggest challenge will be developing, then bringing to market, a new generation of application-specific pumping systems – at the required scale and the right price-points.

The post Hyperloop: think big, win big appeared first on CERN Courier.

]]>
Opinion Leybold’s Tom Kammermeier on the German manufacturer’s long-range bet on hyperloop vacuum-based transportation systems. https://cerncourier.com/wp-content/uploads/2021/01/CCSupp_2_Vac_2020_HYPERLOOP_frontis.jpg
A joined-up vision for vacuum https://cerncourier.com/a/a-joined-up-vision-for-vacuum/ Wed, 06 Jan 2021 09:27:04 +0000 https://preview-courier.web.cern.ch/?p=90458 ESS vacuum group leader Marcelo Juni Ferreira describes the essential role of vacuum technology in this next-generation neutron-science facility.

The post A joined-up vision for vacuum appeared first on CERN Courier.

]]>
An aerial view over the ESS construction site

Neutron science 2.0 is evolving from concept to reality as construction progresses on the European Spallation Source (ESS), a €1.84 billion accelerator-driven neutron source in Lund, Sweden. ESS will deliver first science in 2023 and will, when in full operation, be the world’s most powerful neutron research facility – between 20 and 100 times brighter than the Institut Laue-Langevin (ILL) in Grenoble, France, and up to five times more powerful than the Spallation Neutron Source (SNS) in Oak Ridge, Tennessee, US.

This industrial-scale endeavour represents an amalgam of the most powerful linear proton accelerator ever built; a two-tonne, rotating tungsten target wheel (which produces neutrons via the spallation process); a reference set of 22 state-of-the-art neutron instruments for user experiments (of which 15 are under construction); and a high-performance data management and software development centre (located in Copenhagen). Here, Marcelo Juni Ferreira, vacuum group leader at ESS, tells CERN Courier how vacuum technologies are equally fundamental to the ESS’s scientific programme.

What does your role as ESS vacuum group leader involve?

I head up a 12-strong multidisciplinary team of engineers, scientists, designers and technicians who manage the international network of stakeholders developing the vacuum infrastructure for the ESS. Many of our partners, for example, make “in-kind” contributions of equipment and personnel rather than direct cash investments from the ESS member countries. As such, the ESS vacuum group is responsible for maintaining the facility’s integrated vacuum design approach across all of these contributions and all of our vacuum systems – the proton accelerator, target section, neutron beamlines and the full suite of neutron instruments that will ultimately support user experiments (see “ESS science, funding and partnership”).

In terms of specifics, what is meant by integrated vacuum design?

The integrated approach to vacuum design works on several levels. Cost reduction is a fundamental driver for ESS. The use of standard industry components where possible reduces maintenance and training requirements and minimises the need for expensive product inventory; moreover, through a single framework agreement covering our in-kind partners and industry suppliers, we can work at scale to lower our overall procurement costs.

Marcelo Juni Ferreira

Another motivation is to help the vacuum group support the diverse vacuum requirements across the neutron instruments. The goal in each case is to ensure sustainable, economical and long-term operation of each instrument’s vacuum plant to minimise downtime and maximise research output. To make this possible, each of the neutron instruments (and associated beamlines) has its own “vacuum interface” document summarising key technical specifications and performance requirements – all ultimately aligned with the ESS Vacuum Handbook, the main reference source promoting the use of common vacuum equipment and standards across all aspects of the project.

So, standardisation is a big part of your vacuum strategy?

Absolutely. It’s all about a unified approach to our vacuum equipment as well as the procurement policy for any major hardware/software purchases for the accelerator, the target and the neutron instruments. Another upside of standardisation is that it simplifies the interfaces between the ESS vacuum infrastructure and the ESS safety and control plant – for example, the personnel protection, machine protection and target safety systems.

ESS recently took delivery of the Target Monolith Vessel (TMV), one of the facility’s main vacuum sections. What is the TMV and who built it?

The TMV represents the core building block of the ESS target station and was assembled by our in-kind partners at ESS Bilbao, Spain, working in collaboration with local manufacturers such as Cadinox and AVS. When ESS goes online in 2023, the TMV will enclose all of the target subsystems – the target wheel, moderator, reflector plugs and cryogenic cooling – in a vacuum atmosphere and, with the help of 6000 tonnes of stainless-steel shielding, also confine any activated materials and ionising radiation in case of a highly unlikely event, such as an earthquake or accident (see “ESS operational highlights”).

The monolith is an impressive and complex piece of precision engineering in its own right. The vessel requires exacting and repeatable alignment tolerances (±25 μm) for the target wheel, the moderator and reflector assemblies relative to the incident proton beam as well as the neutron-beam extraction system. Ahead of shipping, ESS Bilbao successfully completed the leak and vacuum tests on the TMV with satisfactory measurements of dew-point temperature, pressure rise and leak detection. The final pressure obtained was 1 × 10–6 mbar, with a leak rate < 1 × 10–8 mbar·l/s.

In terms of the TMV, how does your team design and build for maximum uptime?

The focus on project risk is a collective effort across all support functions and is framed by the ESS Strategic Installation and Test Strategy. With the TMV, for example, our design choices seek to minimise service interruptions to the scientific experiments at ESS. Put another way: each vacuum component in the TMV must offer the longest “time before failure” available on the market. In the case of the rough vacuum pumps, for example, this comes from Kashiyama Industries of Japan through ESS’s supplier Low2High Vacuum in Sweden – offering a dry vacuum pump that’s capable of 24/7, maintenance-free operation for up to three years. We’ve actually tested six of these units running at the laboratory for more than five years and none of them have required any intervention.

ESS science, funding and partnership

Large-scale neutron facilities are routinely used by academic and industrial researchers to understand material properties on the atomic scale, spurring advances across a spectrum of scientific discovery – from clean energy and environmental technology to pharma and healthcare, from structural biology and nanotech to food science and cultural heritage.

ESS is a pan-European project with 13 European nations as members: the Czech Republic, Denmark, Estonia, France, Germany, Hungary, Italy, Norway, Poland, Spain, Sweden, Switzerland and the UK.

Significant in-kind contributions of equipment and expertise – from more than 40 European partner laboratories – are expected to finance more than a third of the overall construction costs for ESS.

ESS will deliver its first science in 2023, with up to 3000 visiting researchers expected every year once the lab is fully operational.

Smart choices like this add up and result in less maintenance, reduced manual handling of active materials (e.g. pump oil) and lower cost per unit life-cycle. Similar thinking informs our approach regarding the TMV’s vacuum “plumbing”. The use of aluminium gaskets and clamps, for example, streamlines installation (compared with CF flanges) and takes into account their low neutron activation in the case of maintenance removal and reassembly ahead of resumed operations (with hands-on manipulation being faster and simpler in each case).

What are the biggest operational challenges in terms of preparing the TMV for high-reliability vacuum performance?

The major effort on the vessel was – and still is – to qualify all in-vacuum parts and connections in terms of their leak rates, pressure-code requirements and surface finishing. This includes the water-cooled shielding blocks, hydrogen-cooled moderator/reflector, and the helium cooling unit for the rotating tungsten target wheel (which employs a ferrofluidic sealing system). It’s a huge collective effort in vacuum: there are more than 1000 flanges, around 20,000 bolts and 6000 tonnes of load in the fully configured TMV (which measures 6 m internal diameter and 11 m high).

There will be two possible modes of TMV operation, with the target residing in either high vacuum or helium at slightly below atmospheric pressure. What’s the rationale here?

One of the high-level design objectives for ESS states that the TMV should be built to last for 50 years of operation while satisfying all performance and safety criteria. Our initial simulations showed that “cleanliness” of the volume surrounding the collisions of the proton beam and the tungsten target wheel will be essential for slowing material degradation and therefore delivering against this objective. What’s more, the specification of a 5 MW proton beam means that secondary gamma and neutron radiation will be produced as a side-effect of the spallation process, further emphasising the need for a controlled environment as well as appropriate cooling of the shielding blocks to counter radiation-induced heating effects.

ESS operational highlights

Fundamental principles

ESS and Daresbury vacuum teams and components

At the heart of the ESS is a linear accelerator that produces up to a 5 MW beam of 2 GeV protons, with the bulk of the acceleration generated by more than 100 superconducting radio-frequency (RF) cavities.

These accelerated protons strike a rotating tungsten target wheel (2.6 m diameter) to produce a beam of neutrons via nuclear spallation – i.e. the impact on the tungsten nuclei effectively “spalls” off free neutrons.

The target wheel rotates at 23.3 rpm and is cooled by a flowing helium gas system interfaced with a secondary water system.

The spalled neutrons pass through water premoderators, a supercritical hydrogen moderator (cooled to about 17 K) and a beryllium-lined reflector – all of which are housed in a replaceable plug – to slow the neutrons to useful energies before distribution to a suite of 15 neutron-science instruments.

The TMV has an Active Cells Facility to perform remote handling, disassembly and storage of components that are taken out of the monolith after reaching the end of their lifetime; steel shielding blocks prevent the escape of neutron/gamma ionising radiation.

TMV vacuum considerations

The TMV is designed to accommodate various leak-rate loads, including: outgassing of vacuum components; air leaks into the vacuum vessel; water leaks from internal piping plus humidity and condensation present during operations and pump down; and helium leaks from the target wheel.

Total gas in-leakage is critical and, in conjunction with the capacity of the turbomolecular pumping system, will determine not only the TMV operating pressure but also the refrigeration capacity for the cryo-condensing coil for pumping of potential water leaks.

In vacuum mode, TMV pressures < 10–4 mbar will be required for interfacing with the UHV environment of the proton accelerator (i.e. to keep gas flows into the accelerator section to an acceptable level).

TMV vacuum components (including polymer seals) must be compatible with operation up to 35 °C in harsh gamma/neutron radiation environments.

The optimal operating mode will be high vacuum (< 10–4 mbar), which will negate the need for a proton beam window between the proton accelerator and the target. This, in turn, will lower the annual operating costs. Other advantages include up to 1% improved neutronic performance, reduced beam scattering on the TMV components (and therefore less heat load and radiation damage), as well as a cleaner image for the beam-imaging diagnostics.

Nevertheless, we will design and build a proton beam window, so that it is ready to install for operation under helium should an unanticipated issue arise with the TMV vacuum. It is worth noting that in this “helium mode” a pump-and-purge capability is provided to ensure high helium purity (> 99.9%).

What lessons can other big-science facilities learn from your experiences with the ESS vacuum project?

With ESS we are entering new territory and the reliability of all our components – vacuum and otherwise – requires close collaboration as well as consistent communication on all levels with our equipment vendors and in-kind partners. Operationally, there’s no doubt that the TMV and the other ESS vacuum systems have benefited from our dedicated vacuum laboratory – one of the first in-kind hardware shipments back in 2015 – and our efforts to recruit and build a skilled team of specialists in those early days of the project. The laboratory includes test facilities for vacuum integration, gauge calibration and materials outgassing studies – capabilities that allow us to iterate and optimise field solutions in good time ahead of deployment. All of which ultimately helps us to minimise project risk, with technical decisions informed by real-world testing and not just prior experience.

The post A joined-up vision for vacuum appeared first on CERN Courier.

]]>
Opinion ESS vacuum group leader Marcelo Juni Ferreira describes the essential role of vacuum technology in this next-generation neutron-science facility. https://cerncourier.com/wp-content/uploads/2020/12/CCSupp_2_Vac_2020_FERREIRA_frontisNEW.jpg
Collaboration yields vacuum innovation https://cerncourier.com/a/collaboration-yields-vacuum-innovation/ Wed, 06 Jan 2021 09:26:53 +0000 https://preview-courier.web.cern.ch/?p=90482 CERN is home to a unique innovation ecosystem pioneering advances in vacuum science, technology and engineering

The post Collaboration yields vacuum innovation appeared first on CERN Courier.

]]>
ALICE beampipe

Vacuum represents a core enabling technology in particle accelerators. Without the required degree of vacuum, the rate of interaction between circulating particles and residual gas molecules would generate several adverse conditions. Particle beams would increase in size and so decrease in luminosity at the interaction points. Beam instability and the rate of particle loss would grow, endangering instrumentation and increasing the background noise in physics experiments. Induced radioactivity and bremsstrahlung radiation would increase risks for personnel and cause damage to the accelerator hardware. What’s more, vacuum is crucial for avoiding electrical breakdown in high-voltage equipment, as well as for thermal insulation of cryogenic fluids, reducing heat “inleaks” to acceptable levels.

Operationally, the level of vacuum required for particle accelerators spans a large range of residual gas densities – from high vacuum (HV, 10–3 to 10–9 mbar) through ultrahigh vacuum (UHV, 10–9 to 10–12 mbar) to extreme high vacuum (XHV, usually defined as 10–12 mbar and lower). Applications in thermal insulation, for example, require a gas-molecule density 10 million times lower than sea-level atmospheric pressure – i.e. less than 10–4 mbar. On the other hand, a modern synchrotron facility requires UHV residual gas densities of ≤ 10–9 mbar, while some antimatter experiments impose a rarefaction requirement in the region of 10–15 mbar. In the most challenging experiments, vacuum is an enclosed space where only several gas molecules per cm³ persist in their random motion, bouncing from one wall of the vacuum vessel to another and able to travel thousands of millions of km before colliding with another molecule (roughly equivalent to the distance from the Sun to Jupiter).
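
A quick kinetic-theory estimate shows how such distances arise. The sketch below evaluates the standard mean-free-path formula for the few-molecules-per-cm³ regime quoted above; the residual gas is assumed to be hydrogen with a textbook molecular diameter.

import math

# Kinetic-theory check of the "Sun to Jupiter" figure above.
# Assumes residual hydrogen with a textbook molecular diameter.

n = 3.0e6    # number density: ~3 molecules per cm³, i.e. 3e6 per m³
d = 2.7e-10  # effective H2 molecular diameter in metres (assumed)

# Mean free path between molecule-molecule collisions:
lam = 1.0 / (math.sqrt(2) * math.pi * d**2 * n)

print(f"mean free path: {lam:.1e} m = {lam / 1e3:.1e} km")
# ~1e9 km, comparable to the Sun-Jupiter distance of ~7.8e8 km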

Writ large, it is no surprise that, with more than 125 km of beampipes and liquid-helium transfer lines, CERN is home to one of the world’s largest vacuum systems – and certainly the longest and most sophisticated in terms of particle accelerators. From HV to the UHV/XHV regimes, the complexity of vacuum systems for the particle accelerators at CERN, and other big-science laboratories like it, stems largely from the interaction between particle beams and the surfaces that surround them.

Beam interactions

This “beam–surface dialogue” induces gas desorption from the vacuum system walls, an interaction that can be the dominant source of gas. Indeed, if atmospheric gas is evacuated rapidly from the vacuum system, with no in-leakage of air, it is possible to attain UHV conditions in just a few hours for chamber volumes of the order of a cubic metre. Although the vacuum-system walls release gases spontaneously – mainly water vapour and hydrogen – the choice of suitable materials and thermal treatments reduces the outgassing rates to an acceptable level before accelerator operation. As such, beam-induced gas desorption remains the biggest headache – and this effect, of course, arises only when the particle beams are in circulation.

Beam losses on the chamber walls can be a direct source of gas in the accelerator vacuum system. For the most part, however, beam-induced gas desorption occurs indirectly via the emission of synchrotron light and the beam-induced acceleration of electrons and ions created, for example, by residual gas ionisation. The synchrotron-light-induced desorption is mediated by surface–electron quantum transitions leading to the extraction of photoelectrons, which can desorb residual gas molecules in two ways: first when leaving the chamber wall, and again when striking the wall subsequently. This effect is by far the main source of gas in circular high-energy electron accelerators and plays a significant role in the Large Hadron Collider (LHC), where the critical energy of the emitted photons is around 40 eV (i.e. large enough to extract photoelectrons and induce desorption).

Vacuum diversity

It’s worth noting, though, that there’s no “instant fix” for excessive gas desorption. Even with appropriate chemical surface treatments, accelerator vacuum systems (particularly those for electrons) cannot cope with full beam performance on day one of commissioning. Instead, it is necessary to ramp up the performance of the vacuum system while the beam current is increased in a stepwise fashion. In this way, the dose of particles hitting the surfaces of the vacuum vessel increases (though without excessive beam losses), while desorption yields are reduced via surface cleaning and chemical modification. In the jargon, this optimisation of surface conditioning is known as a “scrubbing run”.

The time taken for surface conditioning can be cut dramatically with the help of nonevaporable getter (NEG) coatings, a concept developed at CERN in the late 1990s. Put simply: the beampipe walls are coated with a micrometre-thick film of Ti–Zr–V alloy that, once heated for a few hours in the accelerator at about 200 °C, provides a clean metal surface that also acts as a pump (i.e. gas molecules are adsorbed by chemical reaction at the surface). During heating, the main reservoir of gas is eliminated as the oxide passivation layer dissolves into the film; after which the cycle repeats whenever adsorption of gas molecules saturates the surface or air venting is necessary.

This NEG capability is deployed at scale by CERN. The 6 km-long beam lines of the LHC’s room-temperature straight sections, for example, are coated entirely with NEG materials, while uptake in several synchrotron research facilities is now envisaged after a pioneering implementation in MAX IV, the Swedish synchrotron. In summary: NEG coatings combine distributed, high-speed pumping with negligible space requirements – a win–win for small-diameter beampipes in the current generation of electron accelerators.
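
The benefit of pumping along the wall rather than from discrete ports can be quantified with the classic molecular-flow conductance formula for a long tube, C ≈ 12.1 d³/l (in litres per second, with the diameter d and length l in cm, for room-temperature air). The sketch below uses hypothetical beampipe dimensions to show how a lumped pump is throttled by the pipe itself.

# Why NEG coatings win in narrow beampipes: a lumped pump at the end of
# a long, narrow tube is throttled by the tube's own conductance.
# Hypothetical geometry; the conductance formula is the classic
# molecular-flow result for air at room temperature.

d_cm = 2.0      # beampipe diameter (cm), assumed
l_cm = 200.0    # distance to the nearest lumped pump (cm), assumed
S_pump = 300.0  # nominal speed of the lumped pump (l/s), assumed

C = 12.1 * d_cm**3 / l_cm   # tube conductance in l/s

# Pumping speed and conductance in series combine like parallel resistors:
S_eff = 1.0 / (1.0 / S_pump + 1.0 / C)

print(f"tube conductance: {C:.2f} l/s")
print(f"effective speed at the far end: {S_eff:.2f} l/s")
# However large S_pump is made, S_eff can never exceed C (~0.5 l/s here),
# whereas a NEG coating pumps along the entire wall with no such limit.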

Another significant component of the beam–surface dialogue within particle accelerators is the heating of materials exposed to the circulating beams. One of two possible tracks for the transfer of thermal power is the interaction between the electromagnetic field generated by the beams with the surrounding materials, a process that induces electrical currents on the beam-facing surfaces.

Support for projects like the HL-LHC requires full cognisance of some pretty harsh operating environments

These currents may in turn give rise to Joule heating, typically mitigated by using a good electrical conductor (like copper) as the material of choice for the beampipes or as a layer deposited on stainless steel, usually via electrolytic techniques. Geometrical discontinuities of the vacuum chambers may also result in resonant interaction with the beam, creating enhanced local power dissipation in trapped modes – a problem that can be solved through optimised design of the vacuum chambers and their transitions.

Taken together, these mitigation measures have another highly beneficial side-effect. Beam-induced surface currents generate electromagnetic fields which, in turn, interact back with the beam, potentially disrupting its characteristics or its long-term stability in the accelerator. As such, the overall drive to reduce the impedance of the vacuum system (and of all in-vacuum components) results in longer beam lifetimes and preserved beam emittance, ultimately leading to higher collision rates in physics experiments.

The heat is still on

Ongoing innovation will be essential, however. In the next generation of high-energy proton accelerators operating with superfluid helium – the proposed Future Circular Collider (FCC-hh) is a case in point – the impedance of the beampipes could prove detrimental to the global heat-load balance of the cryogenic system. To counter this heat source, CERN has initiated an ambitious feasibility study in which the inner walls of vacuum chambers are coated with high-temperature superconductors (HTS). Owing to the much-reduced electrical losses of superconductors compared with normal metals, successful use of HTS promises a considerable reduction in impedance. It’s early days, but initial results with HTS rare-earth barium copper oxide (ReBCO) test coatings are extremely encouraging.

At the same time, synchrotron radiation and electrons hitting the walls of the vacuum system also convey part of the beam power to the surrounding vessels. The multiplication of impinging electrons by the surface and their acceleration by the beam – a process known as electron multipacting – is of particular concern for cryogenic systems. In the LHC, for example, the heat load is intercepted by an intermediate wall maintained at a temperature of 10–20 K rather than 1.9 K (the temperature of the cold bore, i.e. the chamber in tight contact with the magnet). Underpinning this arrangement is the insertion into the cold bore of an additional pipe – the so-called beam screen – which is made of copper-colaminated stainless steel and cooled by a dedicated helium circuit. The beam screen and cold bore in turn communicate through pumping slots so that gas molecules are cryoadsorbed on the coldest surface.
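
The thermodynamic logic of intercepting the heat at 10–20 K rather than at 1.9 K is easy to illustrate. Even an ideal Carnot refrigerator pays a steep price to extract heat at 1.9 K, and real cryoplants are several times less efficient again; the figures below are illustrative:

    def carnot_watts_at_room_temp(q_cold, t_cold, t_warm=300.0):
        """Minimum (Carnot) compressor power, in watts at t_warm, needed to
        extract q_cold watts at t_cold; real plants need several times more."""
        return q_cold * (t_warm - t_cold) / t_cold

    for t in (1.9, 20.0):
        w = carnot_watts_at_room_temp(1.0, t)
        print(f"1 W extracted at {t:>4} K costs at least {w:.0f} W at 300 K")
    # 1 W at 1.9 K -> ~157 W; 1 W at 20 K -> ~14 W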

CERN’s vacuum roadmap: collaboration is key

The VAX vacuum module

The evolution of vacuum technology and engineering at CERN is closely aligned with accelerator operation and priorities; the organisation’s fundamental science programme; and, at a high level, the 2020 update of the European Strategy for Particle Physics. As the restart of the LHC physics programme approaches (slated for early 2022), the reliability of the CERN vacuum system is our primary focus – especially after a shutdown that will have run to more than two years.

For sure, 2021 will be an intense period for the CERN vacuum team. An immediate concern is the restart of beam circulation in vacuum systems that were open to the air for planned interventions and modification – sometimes for several days or weeks. The heat load generated by the beams in the LHC’s arcs will be under the spotlight as well as the performance of the upgraded LHC’s injector chain. There is no doubt that our nights will be filled with worries – worries that will hopefully dissipate as new science breakthroughs are announced for the LHC’s beams and detectors. 

Maintaining momentum

In parallel, we will maintain the pace of the HL-LHC programme, implementing vacuum innovations elaborated over the past five years. Chief among them are the new beam screens for the triplet magnets of the two high-luminosity experiments – CMS and ATLAS. This advanced concept integrates a carbon coating (as an electron-multipacting suppressor) and tungsten blocks (to absorb collision debris before it interacts with the magnets). Design optimisation required several iterations and extensive multiphysics simulation. The vacuum team subsequently evaluated the mechanical stability of the HL-LHC beam screen during the electromagnetic and thermal transients generated by a magnet quench (i.e. a sudden loss of superconducting properties). Experimental investigations of the vacuum performance – via measurement of adsorption isotherms – allowed us to choose 60 K as the operational temperature for the new beam screen.

Another notable HL-LHC achievement is the vacuum module installed between the last focusing magnet of the accelerator and the high-luminosity experiments. Referred to as VAX, this arrangement comprises a compact set of vacuum components, pumps, valves and gauges installed in an area of limited access and relatively high radioactivity. As such, the VAX design is fully compatible with robot intervention, enabling leak detection, gasket change and complete removal of parts to be carried out remotely and safely. The direction of travel is clear: robotic technologies will have a pivotal role to play in the vacuum systems of next-generation, high-intensity particle accelerators.

Joined-up thinking

Operationally, it is already time to prepare CERN and a new generation of vacuum experts for the post-LHC era. Our reference point is the aforementioned European Strategy for Particle Physics, with its initial prioritisation of an electron–positron Higgs factory to be followed, in the long run, by a 100 km-circumference proton–proton collider at “the highest achievable energy”.

These accelerators will push vacuum science and technology to the limit, amplifying the challenges that we face today with the LHC. Yet there’s plenty of encouraging progress to report. An optimised design for the vacuum chambers is already in the works, thanks to advanced simulations of synchrotron radiation and gas-molecule distribution performed using CERN-maintained software. Furthermore, the Karlsruhe Research Accelerator (KARA) in Germany reports excellent results in its evaluations of a prototype vacuum chamber for the proton–proton collider. The biggest challenge remains cost: engineering solutions adopted at the km scale cannot simply be scaled up to systems 10 to 100 times longer – the vacuum system would be prohibitively expensive.

Herein lies an opportunity – and more specifically a call to arms for vacuum specialists to work collaboratively across their respective disciplines to imagine, and subsequently deliver, the technology innovations that will address the economic challenges of big science in the 21st century. The potential synergies are already evident as the next generation of particle accelerators takes shape alongside new concepts for advanced gravitational-wave telescopes – diverse physics initiatives with a common interest in driving down the cost of their enabling vacuum systems.

A granular understanding of the fundamental physics certainly helps here. While synchrotron-radiation power depends only on the beam parameters, the contribution of electrons to the heat load depends on the surface parameters, above all the secondary electron yield – i.e. the ratio of emitted to incident electrons. This important characteristic of the surface walls decreases as the dose of impinging electrons accumulates – an additional outcome of beam conditioning. That said, such a decrease takes time and dedicated beam runs, and the mechanism of beam conditioning seems more complex than first anticipated (as observed during Run 2 of the LHC, from 2015–2018). In terms of specifics, the heat load transferred to the beam-screen cooling circuit was found to be higher than expected in four of the LHC’s eight arcs. CERN’s surface experts investigated several surface characteristics to understand this phenomenon and finally spotted anomalous behaviour in copper oxide that could lead to a less effective decrease of the secondary electron yield.
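
A simple way to picture conditioning is a secondary electron yield (SEY) that relaxes exponentially from its as-received value towards a floor as electron dose accumulates. The parameterisation and the numbers below are illustrative only, not LHC measurements:

    import math

    def sey(dose, sey0=2.1, sey_inf=1.1, dose_c=1e-3):
        """Secondary electron yield vs accumulated electron dose (C/mm^2);
        an illustrative exponential parameterisation, not a fitted model."""
        return sey_inf + (sey0 - sey_inf) * math.exp(-dose / dose_c)

    for d in (0.0, 1e-4, 1e-3, 1e-2):
        print(f"dose {d:.0e} C/mm^2 -> SEY = {sey(d):.2f}")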

The sheer scale of CERN’s vacuum infrastructure represents an engineering challenge in its own right

To circumvent the need for additional beam conditioning, CERN’s vacuum group has developed amorphous carbon coatings with very low secondary-electron yields to effectively prevent electron multipacting. Such thin films are the baseline for the beampipes of the final focusing magnets for the High-Luminosity LHC (HL-LHC) upgrade, presently under way. The carbon coatings have also been implemented in selected areas of the Super Proton Synchrotron (which injects protons into the LHC) to reduce the direct effect of electron clouds on beam performance.

Another countermeasure to electron multipacting involves increasing the roughness of the walls of the vacuum vessel, such that secondary electrons are intercepted by the surfaces before they can be accelerated by the beam. In this instance, the CERN vacuum group is implementing laser treatments developed by two UK research centres – STFC Daresbury Laboratory and the University of Dundee. The laser, which is introduced into the beampipes using a dedicated robot from GE Inspection Robotics, engraves small grooves azimuthally, with a spacing of a few tens of micrometres. Furthermore, ablated material redeposits on the surface as nanometric particles, further enhancing electron capture.

Measurement and control

Zoom out from the esoteric complexity of beam–surface interactions and the sheer scale of CERN’s vacuum infrastructure represents an engineering challenge in its own right – not least in terms of vacuum metrology, diagnostics and control. In all, more than 12,000 vacuum instruments – gauges, pumps, valves and associated controllers with almost a million configuration settings – are managed via a flexible database running in the Cloud. Work is well advanced to mine the vast amounts of data generated by this network of vacuum systems – ultimately creating a “data-streaming pipeline” that will integrate the latest analytics software with a new generation of open-source diagnostic and reporting tools.

Preparation of amorphous carbon coatings

Meanwhile, at the operational sharp-end, the measurement of extremely low pressures remains a core competency of the CERN vacuum team. This capability preserves, indeed builds on, the legacy of the Intersecting Storage Rings (ISR), the world’s first hadron collider and a pioneering environment for vacuum technology during the late 1960s and 1970s. The vacuum gauges operating at CERN today in the 10⁻⁷–10⁻¹² mbar range are copies of the original models adopted for the ISR, while those in use in CERN’s R&D laboratories and in antimatter experiments (for measurement down to 10⁻¹⁴ mbar) are the result of further developments in the late 1970s.

Studies of vacuum gauges to provide continuous measurement at even lower pressures are also under way at CERN, often in collaboration with Europe’s metrological community. In the framework of the EURAMET-EMPIR programme, for example, CERN vacuum experts have participated in the development and characterisation of a vacuum gauge with an ultrastable sensitivity for the transfer of vacuum standards amongst European research institutes (see “Vacuum metrology: made to measure”).

More broadly, support for projects like the HL-LHC requires full cognisance of some pretty harsh operating environments. Fundamentally, increasing beam currents means that vacuum systems and their electronic control circuits are more and more susceptible to radiation damage. A key determinant of the global cost/performance of a large-scale vacuum system is the deployment of electronics in the accelerator tunnels – with weaknesses in the devices gradually revealed through increasing radiation exposure. With this in mind, and by using radiation sources available on site as well as at other European research institutes, the CERN vacuum team has been busy evaluating the “radiation hardness” of hundreds of critical components and electronic devices.

Looking to the future, it’s evident that major accelerator initiatives such as the HL-LHC and the proposed FCC will maintain CERN’s role as one of the world’s leading R&D centres for vacuum science and technology – a specialist capability that will ultimately support fundamental scientific advances at CERN and beyond. 

The post Collaboration yields vacuum innovation appeared first on CERN Courier.

]]>
Feature CERN is home to a unique innovation ecosystem pioneering advances in vacuum science, technology and engineering https://cerncourier.com/wp-content/uploads/2021/01/CCSupp_2_Vac_2020_INNOV_frontis_feature.jpg
Vacuum metrology: made to measure https://cerncourier.com/a/vacuum-metrology-made-to-measure/ Wed, 06 Jan 2021 09:26:43 +0000 https://preview-courier.web.cern.ch/?p=90492 A pan-European consortium is working towards an international standard for the commercial manufacture of ionisation vacuum gauges.

The post Vacuum metrology: made to measure appeared first on CERN Courier.

]]>
PTB scientists Karl Jousten and Claus Illgen

Absence, it seems, can sometimes manifest as a ubiquitous presence. High and ultrahigh vacuum – broadly the “nothingness” defined by the pressure range spanning 0.1 Pa (0.001 mbar) through 10⁻⁹ Pa – is a case in point. HV/UHV environments are, after all, indispensable features of all manner of scientific endeavours – from particle accelerators and fusion research to electron microscopy and surface analysis – as well as a fixture of diverse multibillion-dollar industries, including semiconductors, computing, solar cells and optical coatings.
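
To put those pressures in perspective, the ideal-gas relation n = P/(k_B T) converts each reading into a molecular number density – a reminder that even the deepest laboratory vacuum is far from empty:

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    T   = 293.0          # room temperature, K

    for p_mbar in (1e-3, 1e-9, 1e-11):
        p_pa = p_mbar * 100.0        # 1 mbar = 100 Pa
        n = p_pa / (k_B * T)         # molecules per m^3
        print(f"{p_mbar:.0e} mbar -> {n:.1e} molecules per m^3")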

For context, the ionisation vacuum gauge is the only instrument able to make pressure measurements in the HV/UHV regime, exploiting the electron-induced ionisation of gas molecules within the gauge volume to generate a current that’s proportional to pressure (see figure 1 in “Better traceability for big-science vacuum measurements”). Integrated within a residual gas analyser (RGA), for example, these workhorse instruments effectively “police” HV/UHV systems at a granular level – ensuring safe and reliable operation of large-scale research facilities by monitoring vacuum quality (detecting impurities at the sub-ppm level), providing in situ leak detection and checking the integrity of vacuum seals and feed-throughs.
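The working principle can be written in one line: the collected ion current is proportional to both the electron emission current and the pressure, I_ion = S × I_e × P, where S is the gauge sensitivity. The values below are typical of Bayard-Alpert-type gauges rather than any specific instrument:

    S     = 20.0     # gauge sensitivity for N2, 1/mbar (typical value, assumed)
    I_e   = 1e-3     # electron emission current, A
    I_ion = 2e-12    # measured ion collector current, A

    P = I_ion / (S * I_e)
    print(f"indicated pressure = {P:.0e} mbar")   # 1e-10 mbar
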

Setting the standard

Notwithstanding the ubiquity of HV/UHV systems, it’s clear that many scientific and industrial users stand to gain – and significantly so – from an enhanced approach to pressure measurement in this rarefied domain. For their part, HV/UHV end-users, metrology experts and the International Organization for Standardization (ISO) all acknowledge the need for improved functionality and greater standardisation across commercial ionisation gauges – in short, enhanced accuracy and reproducibility plus more uniform sensitivity across a broad spectrum of gas species.

That wish-list, it turns out, is the remit of an ambitious pan-European vacuum metrology initiative – the catchily titled 16NRM05 Ion Gauge – within the European Metrology Programme for Innovation and Research (EMPIR), which in turn is overseen by the European Association of National Metrology Institutes (EURAMET). As completion of its three-year R&D effort approaches, the EMPIR 16NRM05 consortium appears well on its way to finalising the design parameters for a new ISO standard for ionisation vacuum gauges – one that will combine improved accuracy (total relative uncertainty of 1%), robustness and long-term stability with known relative gas-sensitivity factors.

It’s a design that cannot be found on the market… The results have been very encouraging

Another priority for EMPIR 16NRM05 is “design for manufacturability”, such that any specialist manufacturer will be able to produce standardised, next-generation ionisation gauges at scale. “We work closely with the gauge manufacturers – VACOM of Germany and INFICON of Liechtenstein are consortium members – to make sure that any future standard will result in an instrument that is easy to use and economical to produce,” explains Karl Jousten, project lead and head of section for vacuum metrology at Physikalisch-Technische Bundesanstalt (PTB), Germany’s national measurement institute (NMI) in Berlin.

In fact, this engagement with industry underpins the project’s efforts to unify something of a fragmented supply chain. Put simply: manufacturers currently use a range of electrode materials, operating potentials and, most importantly, geometries to define their respective portfolios of ionisation gauges. “It’s no surprise,” Jousten adds, “that gauges from different vendors vary significantly in terms of their relative sensitivity factors. What’s more, all commercially available gauges lack long-term and transport stability – the instability being about 5% over one year.”

The EMPIR 16NRM05 project partners – five national measurement institutes (including PTB), VACOM and INFICON, along with vacuum experts from CERN and the University of Lisbon – have sought to bring order to this disorder by designing an ionisation gauge that is at once compatible with standardisation while exceeding current performance levels. When the project kicked off in summer 2017, for example, the partners set themselves the goal of improving the relative standard uncertainty due to long-term and transport instability from about 5% to below 1% for nitrogen gas. Another priority involves tightening the spread of sensitivity factors for different gas species (from about 10% to 2–3%) which, in turn, will help to streamline the calibration of relative gas sensitivity factors for individual gauges and multiple gas species.

It’s all about the detail

For starters, the consortium sought to identify and prioritise a set of high-level design parameters to underpin any future ISO-standardised gauge. A literature review of 260 relevant academic papers (from as far back as the 1950s) yielded some quick wins and technical insights to inform subsequent simulations (using the commercial software packages OPERA and SIMION) of a v1.0 gauge design as a function of electrode positions, geometry and overall dimensions. Meanwhile, the partners carried out a statistical evaluation of the manufacturing tolerances for the electrode positions as well as a study of candidate electrode materials before settling on a “model gauge design” for further development.

“It’s a design that cannot be found on the market,” explains Jousten. “While somewhat risky, given that we can’t rely on prior experience with existing commercial products, the consortium took the view that the instabilities in current-generation gauges could not be overcome by modifying existing designs.” With a clear steer to rewrite the rulebook, VACOM and INFICON developed the technical drawings and produced 10 prototype gauges to be tested by NMI consortium members – a process that informed a further round of iteration and optimisation.

Better traceability for big-science vacuum measurements

Figure 1

The ionisation vacuum gauge is fundamental to the day-to-day work of the vacuum engineering teams at big-science laboratories like CERN. There’s commissioning of HV/UHV systems in the laboratory’s particle accelerators and detectors, with monitoring of possible contamination or leaks between experimental runs of the LHC; pass/fail acceptance testing of vacuum components and subsystems prior to deployment; and a range of offline R&D activities, including low-temperature HV/UHV studies of advanced engineering materials.

“I see the primary use of the standardised gauge design in the testing of vacuum equipment and advanced materials prior to installation in the CERN accelerators,” explains Berthold Jenninger, a CERN vacuum specialist and the laboratory’s representative in the EMPIR 16NRM05 consortium. “The instrument will also provide an important reference to simplify the calibration of vacuum gauges and RGAs already deployed in our accelerator complex.”

The underlying issue is that commercial ionisation vacuum gauges are subject to significant drifts in their sensitivity during regular operation and handling – changes that are difficult to detect without access to an in-house calibration facility. Such facilities are the exception rather than the norm, however, given their significant overheads and the need for specialist metrology personnel to run them.

Owing to its stability, the EMPIR 16NRM05 gauge design promises to address this shortcoming by serving as a transfer reference for commercial ionisation vacuum gauges. “It will be possible to calibrate commercial vacuum gauges simply by comparing their readings with respect to that reference,” says Jenninger. “In this way, a research lab will get a clearer idea of the uncertainties of their gauges and, in turn, will be able to test and select the products best suited for their applications.”

Measurements of outgassing rate, pumping speed and vapour pressure at cryogenic temperatures will all benefit from the enhanced precision and traceability of the new-look gauge. Similarly, measurements of the ionisation cross-sections induced by electrons, ions or photons also rely on gas-density measurement, so uncertainties in these properties will be reduced.

“Another bonus,” Jenninger notes, “will be enhanced traceability and comparability of vacuum measurements across different big-science facilities.”

“The results have been very encouraging,” explains Jousten. Specifically, the measured sensitivity of the latest model gauge design agrees with simulations, while the electron transmission through the ionisation region is close to 100%. As such, the electron path length is well-defined, and it can be expected that the relative sensitivities will relate exactly to the ionisation probabilities for different gases. For this reason, the fundamentals of the model gauge design are now largely fixed, with the only technical improvements in the works relating to robustness (for transport stability) and better electrical insulation between the gauge electrodes.

“Robustness appears fine, but is still under test at CMI [in the Czech Republic],” says Jousten. “Right now, the exchange of the emitting cathode – particularly its positioning – seems to depend a little too much on the skill of the technician, though this variability should be addressed by future industrial designs.”

Summarising progress as EMPIR 16NRM05 approaches the finishing line, Jousten points out that PTB and the consortium members originally set out to develop an ionisation vacuum gauge with good repeatability, reproducibility and transport robustness, so that relative sensitivity factors are consistent and can be accumulated over time for many gas species. “It seems that we have exceeded our target,” he explains, “since the sensitivity seems to be predictable for any gas for which the ionisation probability by electrons is known.” The variation of sensitivity for nitrogen between gauges appears to be < 5%, so that no calibration is necessary when the user is comfortable with that level of uncertainty. “At present,” Jousten concludes, “it looks like there is no need to calibrate the relative sensitivity factors, which represents enormous progress from the end-user perspective.”

Of course, much remains to be done. Jousten and his colleagues have already submitted a proposal to EURAMET for follow-on funding to develop the full ISO Technical Specification within the framework of ISO Technical Committee 112 (responsible for vacuum technology). In 2021, Covid permitting, the consortium members will then begin the hard graft of dissemination, presenting their new-look gauge design to manufacturers and end-users.

The post Vacuum metrology: made to measure appeared first on CERN Courier.

]]>
Feature A pan-European consortium is working towards an international standard for the commercial manufacture of ionisation vacuum gauges. https://cerncourier.com/wp-content/uploads/2021/01/CCSupp_2_Vac_2020_EURAMET-Illgen_feature.jpg
MEDICIS shows its strength https://cerncourier.com/a/medicis-shows-its-strength/ Fri, 18 Dec 2020 11:27:28 +0000 https://preview-courier.web.cern.ch/?p=90399 CERN’s MEDICIS facility is producing novel radioisotopes for medical research.

The post MEDICIS shows its strength appeared first on CERN Courier.

]]>
The robot target handler at MEDICIS

The use of radioisotopes to treat cancer goes back to the late 19th century. Great strides have been made since, and today radioisotopes are widely used by the medical community. Produced mostly in reactors and cyclotrons, radioisotopes are used both to diagnose cancers and other diseases, such as heart irregularities, and to deliver very small radiation doses exactly where they are needed, so as to avoid destroying the surrounding healthy tissue.

However, many currently available isotopes do not combine the most appropriate physical and chemical properties and, in the case of certain tumours, a different type of radiation could be better suited. This is particularly true of the aggressive brain cancer glioblastoma multiforme and of pancreatic adenocarcinoma. Although external beam gamma radiation and chemotherapy can improve patient survival rates, there is a clear need for novel treatment modalities for these and other cancers.

On 12 December 2017, a new facility at CERN called MEDICIS produced its first radioisotopes: a batch of terbium-155 (¹⁵⁵Tb), one of a quadruplet of Tb isotopes considered promising for both diagnosis and treatment. MEDICIS is designed to produce unconventional radioisotopes with the right properties to enhance the precision of patient imaging and treatment, and it has already expanded the range of radioisotopes available for research projects.

Initiated in 2010, MEDICIS is driven by CERN’s Isotope Mass Separator Online (ISOLDE) facility. ISOLDE has been running for more than 50 years, producing 1300 different isotopes of 73 chemical elements for research in many areas, including fundamental nuclear research, astrophysics and life sciences. The year 2020 marks the 40th anniversary of the first biomedical imaging studies at ISOLDE with ¹⁶⁷Tm, as well as a record operational performance for MEDICIS: a 50% mass-purification yield – a figure that is very rarely achieved.

Although ISOLDE already produces isotopes for medical research, MEDICIS is now able to regularly produce isotopes with specific types of emission or new purity grades – such as the pure beta-emitter ¹⁶⁹Er, or ¹⁵³Sm produced in nuclear reactors. These isotopes were restricted to niche treatments until MEDICIS, during its 2019 and 2020 harvesting campaigns, physically purified them to grades that make them suitable for a new form of personalised medicine: targeted radioimmunotherapy.

ISOLDE directs a high-intensity proton beam from the Proton Synchrotron Booster onto specially developed thick targets, yielding a large variety of atomic fragments. During proton-beam operation, MEDICIS works by placing a second target behind ISOLDE’s: once the isotopes have been produced on the MEDICIS target, an automated conveyor belt carries them to a facility where the radioisotopes of interest are extracted via mass separation and implanted in a metallic foil. The final product is then delivered to local research facilities including the Paul Scherrer Institute, the University Hospital of Vaud, Geneva University Hospitals, or other laboratories such as the UK’s National Physical Laboratory.
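
Delivery to these partner laboratories is a race against the decay law, A(t) = A0 exp(–ln2 · t / T_half). A minimal sketch, taking terbium-155 (half-life of roughly 5.3 days) and an assumed one-day transit as an example:

    import math

    T_half_days = 5.3    # approximate half-life of Tb-155, days
    t_days      = 1.0    # assumed time from collection to delivery

    fraction_left = math.exp(-math.log(2) * t_days / T_half_days)
    print(f"activity remaining after {t_days:.0f} day(s): {fraction_left:.0%}")   # ~88%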

Clinical setting

Once in a medical-research environment, researchers dissolve the isotope and attach it to a molecule, such as a protein or sugar, chosen to target the tumour precisely. This makes the isotope injectable, and the molecule can then adhere to the tumour or organ that needs imaging or treating. The first isotopes selected by the MEDICIS collaboration board were tested in vitro, and then in vivo using mouse models of cancer, opening new territory for researchers in radiopharmaceuticals and molecular oncology.

MEDICIS is not just a world-class facility for novel radioisotopes. It also marks the entrance of CERN into the growing field of theranostics, whereby physicians verify and quantify the presence of cellular and molecular targets in a given patient with a diagnostic radioisotope, before treating the disease with the therapeutic radioisotope. Together with local leading institutes in life and medical sciences and a large network of laboratories, MEDICIS’s exciting scientific programme and technological breakthroughs have triggered a new project supported by the European Commission – PRISMAP, the European medical isotope programme – starting in 2021. Though still young, MEDICIS is a prime example of how accelerators are set to play an increasing role in the production of life-changing medical isotopes.

The post MEDICIS shows its strength appeared first on CERN Courier.

]]>
Feature CERN’s MEDICIS facility is producing novel radioisotopes for medical research. https://cerncourier.com/wp-content/uploads/2020/12/CCSupp_1_Med_2020_MEDICIS_featured.jpg
Very high-energy electrons for cancer therapy https://cerncourier.com/a/very-high-energy-electrons-for-cancer-therapy/ Tue, 15 Dec 2020 21:09:49 +0000 https://preview-courier.web.cern.ch/?p=90271 The VHEE 2020 International Workshop saw more than 400 scientists gather virtually to evaluate the production of very high-energy electrons for radiotherapy.

The post Very high-energy electrons for cancer therapy appeared first on CERN Courier.

]]>
Dosimetry experiment for VHEE studies

Radiotherapy (RT) is a fundamental component of effective cancer treatment and control. More than 10,000 electron linear accelerators are currently used worldwide to treat patients with RT, most operating in the low beam-energy range of 5–15 MeV. Usually the electrons are directed at high-density targets to generate bremsstrahlung, and it is the resulting photon beams that are used for therapy. While low-energy electrons have been used to treat cancer for more than five decades, their very low penetration depth tends to limit their application to superficial tumours. The use of high-energy electrons (up to 50 MeV) was studied in the 1980s, but not clinically implemented.

More recently, the idea of using very high-energy (50–250 MeV) electron beams for RT has gained interest. For higher-energy electrons the penetration becomes deeper and the transverse penumbra sharper, potentially enabling the treatment of deep-seated tumours. While the longitudinal dose deposition is then distributed over a larger region, this can be controlled by focusing the electron beam.
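
The energy–depth relationship can be roughed out with a continuous-slowing-down estimate, treating the stopping power of water as a constant collision term (~2 MeV/cm) plus a radiative term E/X0 (radiation length X0 ≈ 36 cm). This is a teaching-level approximation, not clinical dosimetry:

    import math

    a  = 2.0     # collision stopping power of water, ~MeV/cm
    X0 = 36.1    # radiation length of water, cm

    def csda_range(E_MeV):
        # R = integral of dE / (a + E/X0) = X0 * ln(1 + E / (a * X0))
        return X0 * math.log(1.0 + E_MeV / (a * X0))

    for E in (10, 50, 100, 250):
        print(f"{E:>3} MeV electrons: range ~ {csda_range(E):.0f} cm in water")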

The production of very high-energy electrons (VHEE) for RT was the subject of the VHEE 2020 International Workshop, organised by CERN and held remotely from 5–7 October. More than 400 scientists, ranging from clinicians to biologists, and from accelerator physicists to dosimetry experts, gathered virtually to evaluate the perspectives of this novel technique.

FLASH effect

VHEE beams offer several benefits. First, small-diameter high-energy beams can be scanned and focused easily, enabling finer resolution for intensity-modulated treatments than is possible for photon beams. Second, electron accelerators are more compact and significantly cheaper than current installations required for proton therapy. Third, VHEE beams can operate at very high dose rates, possibly compatible with the generation of the “FLASH effect”.

FLASH-RT is a paradigm-shifting method for delivering ultra-high doses within an extremely short irradiation time (tenths of a second). The technique has recently been shown to preserve normal tissue in various species and organs while maintaining anti-tumour efficacy equivalent to conventional RT at the same dose level, in part due to decreased production of toxic reactive oxygen species. The FLASH effect has been shown to occur with electron, photon and, more recently, proton beams. However, electron beams promise to deliver an intrinsically higher dose compared to protons and photons, especially over the large areas needed for large tumours. Most of the preclinical data demonstrating the increased therapeutic index of FLASH are based on single-fraction or hypo-fractionated RT regimens with 4–6 MeV beams, which cannot treat deep-seated tumours and produce a large lateral penumbra. This problem can be solved by increasing the electron energy above 50 MeV, where the penetration depth is larger.
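
The arithmetic behind “ultra-high dose rate” is stark: delivering the same physical dose in a tenth of a second instead of minutes raises the mean dose rate by three to four orders of magnitude. The numbers below are representative of published FLASH studies rather than any specific protocol:

    dose_Gy = 10.0

    conventional = dose_Gy / 120.0   # ~2 minutes of beam-on time, Gy/s
    flash        = dose_Gy / 0.1     # ~100 ms of beam-on time, Gy/s

    print(f"conventional RT: ~{conventional:.2f} Gy/s")   # ~0.08 Gy/s
    print(f"FLASH-RT:        ~{flash:.0f} Gy/s")          # ~100 Gy/s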

Today, after three decades of research into linear colliders, it is possible to build compact high-gradient (~100 MV/m) linacs, making a compact and cost-effective VHEE RT accelerator a reality. Novel accelerator techniques such as laser-plasma acceleration are also starting to be applied in the VHEE field. Both approaches are currently the subject of a wide international study, as presented at the VHEE workshop.

At the same time as pioneering preliminary work on FLASH was being carried out by researchers at Lausanne University Hospital (CHUV) in Switzerland and the Institut Curie in France, advances in high-gradient linac technology relevant to VHEE were being made at CERN for the proposed Compact Linear Collider (CLIC). An extensive R&D programme on normal-conducting radio-frequency accelerating structures has been carried out to achieve the demanding performance of the CLIC linac: an accelerating gradient of 100 MV/m, a low breakdown rate, micron-tolerance alignment and a high RF-to-beam efficiency (around 30%). All this is now being applied in the conceptual designs of new RT facilities, such as the one jointly being developed by CHUV and CERN.

Dose profile

High-energy challenges

Many challenges, both technological and biological, have to be addressed and overcome for the ultimate goal of using VHEE and VHEE-FLASH as an innovative modality for effective cancer treatment with minimal damage to healthy tissues. All of these were extensively covered and discussed in the different sessions of VHEE 2020.

From the accelerator-technology point of view, an important task is to assess the possibility of focusing and transversely scanning the beam, thereby overcoming the disadvantages associated in the past with low-energy electron- and photon-beam irradiation. In particular, in the case of VHEE–FLASH, it has to be ensured that the biological effect is maintained. Stability, reliability and repeatability are other mandatory ingredients for accelerators operated in a medical environment.

The major challenge for VHEE–FLASH is the delivery of a very high dose rate, possibly over a large area, providing a uniform dose distribution throughout the target. The parameter window in which the FLASH effect takes place also has still to be thoroughly defined, as does its effectiveness as a function of the physical parameters of the electron beam. This, together with a clear understanding of the underlying biological processes, will likely prove essential in order to fully optimise the FLASH RT technique. Of particular importance, as was repeatedly pointed out during the workshop, is the development of reliable online dosimetry for very high dose rates – a regime to which the current standard dosimetry techniques for RT are not adapted. Ionisation chambers, routinely used in medical linacs, suffer from nonlinear effects at very high dose rates. To obtain reliable measurements, R&D is needed to develop novel ion chambers or explore alternative possibilities such as solid-state detectors or the use of calibrated beam diagnostics.

All this demands a large test activity across different laboratories to experimentally characterise VHEE beams and their ability to produce the FLASH effect, and to provide a testbed for the associated technologies. It is also important to compare the properties of the electron beams depending on the way they are produced (radio-frequency or laser-plasma accelerator technologies). 

A number of experimental test facilities are already available to pursue these ambitious objectives: the CERN Linear Electron Accelerator for Research (CLEAR), so far unique in being able to provide both high-energy (50–250 MeV) and high-charge beams; VELA–CLARA at Daresbury Laboratory; PITZ at DESY; and ELBE at HZDR in Dresden, which uses superconducting radio-frequency technology. Further radiobiology studies with laser-plasma-accelerated electron beams are currently being performed at the DRACO petawatt laser facility at the ELBE Center at HZDR Dresden and at the Laboratoire d’Optique Appliquée of the Institut Polytechnique de Paris. Future facilities, as exemplified by the previously mentioned CERN–CHUV facility or the PHASER proposal at SLAC, are also on the horizon.

Establishing innovative treatment modalities for cancer is a major 21st-century health challenge. By 2040, cancer is predicted to be the leading cause of death, with approximately 27.5 million newly diagnosed patients and 16.3 million related deaths per year. The October VHEE workshop demonstrated the continuing potential of accelerator physics to drive new RT treatments, and also included a lively session dedicated to industrial partners. The large increase in attendance since the first workshop, held in 2017 in Daresbury, UK, shows the vitality of and increasing interest in this field.

The post Very high-energy electrons for cancer therapy appeared first on CERN Courier.

]]>
Meeting report The VHEE 2020 International Workshop saw more than 400 scientists gather virtually to evaluate the production of very high-energy electrons for radiotherapy. https://cerncourier.com/wp-content/uploads/2020/12/CCSupp_1_Med_2020_VHEE_frontis-b_v2.jpg
CERN takes next step for hadron therapy https://cerncourier.com/a/cern-takes-next-step-for-hadron-therapy/ Tue, 15 Dec 2020 21:03:01 +0000 https://preview-courier.web.cern.ch/?p=90232 The Next Ion Medical Machine Study ("NIMMS") aims to design a new generation of light-ion accelerators for medicine.

The post CERN takes next step for hadron therapy appeared first on CERN Courier.

]]>
SEEIIST Ion Therapy Research Infrastructure

Twenty years ago, pioneering work at CERN helped propel Europe to the forefront of cancer treatment with hadron beams. The Proton Ion Medical Machine Study (PIMMS), founded in 1996 by a CERN–TERA Foundation–MedAustron–Oncology2000 collaboration, paved the way to the construction of two hadron-therapy centres: CNAO in Pavia (Italy) and MedAustron in Wiener Neustadt (Austria). A parallel pioneering development at GSI produced two similar centres in Germany (HIT in Heidelberg and MIT in Marburg). Since the commissioning of the first facility in 2009, the four European hadron-therapy centres have treated more than 10,000 patients with protons or carbon ions. The improved health and life expectancy of these individuals is the best reward for the vision of all those at CERN and GSI who laid the foundations for this new type of cancer treatment.

Almost four million new cancer cases are diagnosed per year in Europe, around half of which can be effectively treated with X-rays at relatively low cost. Where hadrons are advantageous is in the treatment of deep tumours close to critical organs, or of paediatric tumours. For these cancers, the “Bragg peak” energy-deposition characteristic of charged particles reduces the radiation dose to organs surrounding the tumour, increasing survival rates and reducing negative side effects and the risk of recurrence. With respect to protons, carbon ions have the additional advantages of hitting the target more precisely with higher biological effect, and of being effective against radioresistant hypoxic tumours, which constitute between 1 and 3% of all radiation-therapy cases. Present facilities treat only a small fraction of all patients who could take advantage of hadron therapy, however. The diffusion of this relatively novel cancer treatment is limited primarily by its cost, and by the need for more pre-clinical and clinical research to fully exploit its potential.
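
The depth of the Bragg peak is set by the beam energy. For protons in water it can be roughed out with the empirical Bragg–Kleeman rule R = alpha * E^p; the coefficients below are commonly quoted fits for water, used here purely for illustration:

    alpha, p = 0.0022, 1.77   # empirical fit: R in cm, E in MeV (protons in water)

    def proton_range_cm(E_MeV):
        return alpha * E_MeV ** p

    for E in (70, 150, 230):
        print(f"{E:>3} MeV protons stop after ~{proton_range_cm(E):.0f} cm of water")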

Given these limitations, how can the scientific community contribute to extending the benefits of hadron therapy to a larger number of cancer patients? To review this and similar questions, CERN has recently given a new boost to its medical-accelerator activities, after a long interruption corresponding to the period when CERN resources were directed mainly towards LHC construction. The framework for this renewed effort was provided by the CERN Council in 2017, when it approved a strategy concerning knowledge transfer for the benefit of medical applications. This strategy specifically encouraged new initiatives to leverage existing and upcoming CERN technologies and expertise in accelerator technologies towards the design of a new generation of light-ion accelerators for medicine.

“canted-cosine-theta” coils

The hadron-therapy landscape in 2020 is very different from what it was 20 years ago. The principal reason is that industry has entered the field and developed a new generation of compact cyclotrons for proton therapy. Beyond the four hadron (proton and ion) centres there are now 23 industry-built facilities in Europe providing only proton therapy to about 4000 patients per year. Thanks to this new set of facilities, proton therapy is now highly developed and is progressively extending its reach in competition with more conventional X-ray radiation therapy.

Despite its many advantages over X-rays and protons, therapy with ions (mainly carbon, but other ions like helium or oxygen are under study) is still administered in Europe only by the four large hadron-therapy facilities. In comparison, eight ion-therapy accelerators are in operation in Asia, most of them in Japan, and four others are under construction. The development of new specific instruments for cancer therapy with ions is an ideal application for CERN technologies, in line with CERN’s role of promoting the adoption of cutting-edge technologies that might result in innovative products and open new markets.

Next-generation accelerators

To propel the use of cancer therapy with ions we need a next-generation accelerator, capable of bringing beams of carbon ions to the 430 MeV/u energy required to cover the full body, with smaller dimensions and cost compared to the PIMMS-type machines. A new accelerator design with improved intensity and operational flexibility would also enable a wide research programme to optimise ion species and treatment modalities, in line with what was foreseen by the cancelled BioLEIR programme at CERN. This would allow the exploration of innovative paths to the treatment of cancer such as ultra-short FLASH therapy or the promising combination of ion therapy with immunotherapy, which is expected to trigger an immune response against diffused cancers and metastasis. Moreover, a more compact accelerator could be installed in, or very close to, existing hospitals to fully integrate ion therapy in cancer-treatment protocols while minimising the need to transport patients over long distances.

The development of new specific instruments for cancer therapy with ions is an ideal application for CERN technologies

These considerations are the foundation for the Next Ion Medical Machine Study (NIMMS), a new CERN initiative that aims to develop specific accelerator technologies for the next generation of ion-therapy facilities and help catalyse a new European collective action for therapy with ion beams. The NIMMS activities were launched in 2019, following a workshop at ESI Archamps in 2018 where the medical and accelerator communities agreed on basic specifications for a new-generation machine. In addition to smaller dimensions and cost, these include a higher beam current for faster treatment, operation with multiple ions, and irradiation from different angles using a gantry system.

In addressing the challenges of new designs with reduced dimensions, CERN is building on the development work promoted over the last decade by the TERA Foundation. Reducing the accelerator dimensions relative to the conventional synchrotrons used so far can take different directions, of which two are particularly promising. The first is the classic approach of using superconductivity to increase the magnetic field and decrease the radius of the synchrotron; the second consists of replacing the synchrotron with a high-gradient linear accelerator of new design – in line with the proton-therapy linac being developed by ADAM, a spin-off company of CERN and TERA now part of the AVO group. The goal in both designs is to reduce the surface area occupied by the accelerator by more than a factor of two, from about 1200 to 500 m².

The main avenue to reduced dimensions is superconductivity, and the goal of the first work package is to develop new superconducting magnet designs for pulsed operation, with large apertures and curvatures – suitable for an ideal “square” synchrotron layout with only four 90-degree magnets. Different concepts are being explored, with particular attention to the so-called canted cosine-theta design (see “Combined windings”) used, for example, in orbit correctors for the high-luminosity LHC, of which a team at Lawrence Berkeley National Laboratory has recently developed a curved prototype for medical applications. Other options under study are based on more traditional cosine-theta designs (see “Split yoke”), and on exploiting the potential of modern high-temperature superconductors.
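
A quick rigidity calculation shows why higher-field magnets shrink the ring. The magnetic rigidity B·rho of fully stripped carbon at 430 MeV/u is fixed by kinematics, so raising the dipole field cuts the bending radius proportionally; the two field values compared below are assumptions for illustration:

    import math

    E_k_per_u = 430.0     # kinetic energy, MeV per nucleon
    m_u       = 931.494   # atomic mass unit, MeV
    A, Z      = 12, 6     # carbon-12, fully stripped

    E_tot_per_u = E_k_per_u + m_u
    p_per_u = math.sqrt(E_tot_per_u**2 - m_u**2)    # momentum, MeV/c per nucleon
    B_rho = 3.3356 * (A * p_per_u / 1000.0) / Z     # rigidity, T*m (p in GeV/c)

    for B in (1.5, 4.0):  # room-temperature vs superconducting dipole field, T
        print(f"B = {B} T -> bending radius = {B_rho / B:.1f} m (B*rho = {B_rho:.1f} T*m)")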

curved cosine-theta dipole

The second work package covers the design of a compact linear accelerator optimised for installation in hospitals. Operating at 3 GHz with high field gradients, this linac design profits from the expertise gained with accelerating structures developed for the proposed Compact Linear Collider (CLIC), and uses as an injector a novel source for fully-stripped carbon based on the REX-ISOLDE design. The source is followed by a 750 MHz radio-frequency quadrupole using the design recently developed at CERN for medical and industrial applications.

The third NIMMS work package focuses on compact superconducting designs for the gantry, the large element required to precisely deliver ion beams to the patient that is critical for the cost and performance of an ion-therapy facility. The problem of integrating a large-acceptance beam optics with a compact superconducting magnetic system within a robust mechanical structure is an ideal challenge for the expertise of the CERN accelerator groups. Two designs are being considered: a lightweight rotational gantry covering only 180 degrees originally proposed by TERA, and the GaToroid toroidal gantry being developed at CERN.

NIMMS will consider new designs for the injector linac, with reduced cost and dimensions

The fourth work package is dedicated to the development of new high-current synchrotron designs, and to their integration in future cancer research and therapy facilities. To reduce treatment time, the goal is to accelerate a current more than an order of magnitude higher than in the present European facilities. This requires careful multi-turn injection into the ring and strict control of the beam optics, which add to other specific features of the new design, including a fast extraction that will make tests with the new ultra-fast FLASH treatment modality possible. Two synchrotron layouts are being considered: a more conventional one with room-temperature magnets (see “Ions for therapy”), and a very compact superconducting one of only 27 m circumference. The latter, equipped with a gantry of new design, would allow a single-room carbon-therapy facility to be realised in an area of about 1000 m². Additionally, NIMMS will consider new designs for the injector linac, with reduced cost and dimensions and including the option of producing medical radioisotopes – for imaging and therapy – during the otherwise idle time between two synchrotron injections.

Ambitious work plan

This ambitious work plan exceeds the resources that CERN can allocate to this study, and its development requires collaborations at different levels. The first enthusiastic partner is the new SEEIIST (South East European International Institute for Sustainable Technologies) organisation, which aims at building a pan-European facility for cancer research and therapy with ions (see “Ions for therapy”). SEEIIST is already joining forces with NIMMS by supporting staff working at CERN on synchrotron and gantry design. The second partnership is with the ion therapy centres CNAO and MedAustron, which are evaluating the proposed superconducting gantry design in view of extending the treatment capabilities of their facilities. A third critical partner is CIEMAT, which will build the high-frequency linac pre-injector and validate it with beam. Other partners participating in the study at different levels are GSI, PSI, HIT, INFN, Melbourne University, Imperial College, and of course TERA which remains one of the driving forces behind medical-accelerator developments. This wide collaboration has been successful in attracting additional support from the European Commission via two recently approved projects beginning in 2021. The multidisciplinary HITRIplus project on ion therapy includes work packages dedicated to accelerator, gantry and superconducting magnet design, while the IFAST project for cutting-edge accelerator R&D contains an ambitious programme focusing on the optimisation and prototyping of superconducting magnets for ion therapy with industry.

Every technology starts from a dream, and particle accelerators are there to fulfil one of the oldest: looking inside the human body and curing it without bloodshed. It is up to us to further develop the tools to realise this dream.

The post CERN takes next step for hadron therapy appeared first on CERN Courier.

]]>
Feature The Next Ion Medical Machine Study ("NIMMS") aims to design a new generation of light-ion accelerators for medicine. https://cerncourier.com/wp-content/uploads/2020/12/CCSupp_1_Med_2020_NIMMS_fig1panel1.jpg
Adapting CLIC tech for FLASH therapy https://cerncourier.com/a/adapting-clic-tech-for-flash-therapy/ Tue, 15 Dec 2020 20:32:46 +0000 https://preview-courier.web.cern.ch/?p=90252 A collaboration between CERN and Lausanne University Hospital will see technology developed for the proposed Compact Linear Collider drive a novel cancer radiotherapy facility.

The post Adapting CLIC tech for FLASH therapy appeared first on CERN Courier.

]]>
Walter Wuensch

About 30–40% of people will develop cancer during their lifetimes. Surgery, chemotherapy, immunotherapy and radiotherapy (RT) are used to cure or manage the disease. But around a third of cancers are multi-resistant to all forms of therapy, creating a need for more efficient and better-tolerated treatments. Technological advances in the past decade or so have transformed RT into a precise and powerful treatment for cancer patients. Nevertheless, the treatment of radiation-resistant tumours is complicated by the need to limit doses to surrounding normal tissue.

A paradigm-shifting technique called FLASH therapy, which is able to deliver doses of radiation in milliseconds instead of minutes as for conventional RT, is opening new avenues for more effective and less toxic RT. Pre-clinical studies have shown that the extremely short exposure time of FLASH therapy spares healthy tissue from the hazardous effect of radiation without reducing its efficacy on tumours.

First studied in the 1970s, it is only during the past few years that FLASH therapy has caught the attention of oncologists. The catalyst was a 2014 study carried out by researchers from Lausanne University Hospital (CHUV), Switzerland, and from the Institut Curie in Paris, which showed an outstanding differential FLASH effect between tumours and normal tissues in mice. The results were later confirmed by several other leading institutes. Then, in 2019, CHUV used FLASH to treat a multi-resistant skin cancer in a human patient, causing the tumour to disappear completely with nearly no side effects.

The consistency of pre-clinical data showing a striking protection of normal tissues with FLASH compared to conventional RT offers a new opportunity to improve cancer treatment, especially for multi-resistant tumours. The very short “radiation beam-on-time” of FLASH therapy could also eliminate the need for motion management, which is currently necessary when irradiating tumours that move with respiration. Furthermore, since FLASH therapy operates best with high single doses, it requires only one or two RT sessions as opposed to multiple sessions over a period of several weeks in the case of conventional RT. This promises to reduce oncology workloads and patient waiting lists, while improving treatment access in low-population density environments. Altogether, these advantages could turn FLASH therapy into a powerful new tool for cancer treatment, providing a better quality of life for patients.

The key requirements for CLIC correspond astonishingly well with the requirements for a FLASH facility

CERN and CHUV join forces

CHUV is undertaking a comprehensive research programme to translate FLASH therapy to a clinical environment. No clinical prototype is currently available for treating patients with FLASH therapy, especially for deep-seated tumours. Such treatments require very high-energy beams (see p12) and face technological challenges that can currently be solved only by a very limited number of institutions worldwide. As the world’s largest particle-physics laboratory, CERN is one of them. In 2019, CHUV and CERN joined forces with the aim of building a high-energy, clinical FLASH facility.

The need to deliver a full treatment dose over a large area in a short period of time demands an accelerator that can produce a high-intensity beam. Amongst the current radiation tools available for RT – X-rays, electrons, protons and ions – electrons stand out for their unique combination of attributes. Electrons with an energy of around 100 MeV penetrate many tens of centimetres in tissue so have the potential to reach tumours deep inside the body. This is also true for the other radiation modalities but it is technically simpler to produce intense beams of electrons. For example, electron beams are routinely used to produce X-rays in imaging systems such as CT scanners and in industrial applications such as electron beam-welding machines. In addition, it is comparatively simple to accelerate electrons in linear accelerators and guide them using modest magnets. A FLASH-therapy facility based on 100 MeV-range electrons is therefore a highly compelling option.

Demonstrating the unexpected practical benefits of fundamental research, the emergence of FLASH therapy as a potentially major clinical advance coincides with the maturing of accelerator technology developed for the CLIC electron–positron collider. In a further coincidence, the focus of FLASH development has been at CHUV, in Lausanne, and CLIC development at CERN, in Geneva, just 60 km away. CLIC is one of the potential options for a post-LHC collider, and the design of the facility, as well as the development of key technologies, has been under way for more than 20 years. A recent update of the design, now optimised for a 380 GeV initial-energy stage, and updated prototype testing were completed in 2018.

Despite the differences in scale and application, the key requirements for CLIC correspond astonishingly well with the requirements for a FLASH facility. First, CLIC requires high-luminosity collisions, for example to allow the study of rare interaction processes. This is achieved by colliding very high-intensity and precisely controlled beams: the average current during a pulse of CLIC is 1 A, and the linac hardware is designed to allow two beams less than 1 nm in diameter to collide at the interaction point. High levels of superbly controlled current are also needed for FLASH to cover large tumours in short times. Second, CLIC requires a high accelerating gradient (72 MV/m in the initial stage) to achieve its required collision energy in a reasonably sized facility (11 km for a 380 GeV first stage). A FLASH facility using 100 MeV electrons, based on an optimised implementation of the same technology, requires an accelerator just a couple of metres long. Other system elements, such as diagnostics, beam shaping and delivery as well as radiation shielding, make the footprint of the full facility somewhat larger. Overall, however, the compact accelerator technology developed for CLIC opens the possibility of clinical facilities built within the confines of a typical hospital campus and integrated with existing oncology departments.
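
That “couple of metres” follows directly from the quoted gradient. A minimal sketch, where the fill factor (the fraction of the beamline that is active accelerating structure) is an assumed value:

    E_target    = 100.0   # target beam energy, MeV
    gradient    = 72.0    # MV/m, the CLIC initial-stage value quoted in the text
    fill_factor = 0.7     # assumed fraction of the line that is active structure

    active_length = E_target / gradient
    print(f"active length ~ {active_length:.1f} m; "
          f"overall ~ {active_length / fill_factor:.1f} m of beamline")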

Over the decades, CLIC has invested significant resources into developing its high-current and high-gradient technology. Numerous high-power radio-frequency test stands have been built and operated, serving as prototypes for the radio-frequency system units that make up a linear accelerator. The high-current-beam test accelerator CTF3 enabled beam-dynamics simulation codes to be benchmarked and the formation, manipulation and control of very intense electron beams to be demonstrated. Further beam-dynamics validations and relevant experiments have been carried out at different laboratories, including ATF2 at KEK, FACET at SLAC and the AWA facility at Argonne. CERN also operates the Linear Electron Accelerator for Research (CLEAR) facility, which can accelerate electrons up to 250 MeV, thus matching the energy requirements of FLASH radiotherapy. For the past several years, and beyond the collaboration between CERN and CHUV, the CLEAR facility has been involved in dosimetry studies for FLASH radiotherapy.

Towards a clinical facility

All of this accumulated experience and expertise is now being used to design and construct a FLASH facility. The collaboration between CERN and CHUV is a shining example of knowledge transfer, where technology developed for fundamental research is applied to build a therapeutic facility. While the technical aspects of the project have been defined via exchanges between medical researchers and accelerator experts, the CERN knowledge-transfer group and CHUV’s management have addressed contractual aspects and identified a strategy for intellectual-property ownership. This global approach provides a clear roadmap for transforming the conceptual facility into a clinical reality. From the perspective of high-energy physics, the adoption of CLIC technology in commercially supplied medical facilities would significantly reduce technological risk and increase the industrial supplier base.

The collaboration between CHUV and CERN was catalysed by a workshop on FLASH therapy hosted by CHUV in September 2018, when it was realised that an electron-beam facility based on CLIC technology offers the possibility of a high-performance clinical FLASH facility. An interdisciplinary team comprising medical doctors, medical physicists, radiation biologists, and accelerator physicists and engineers was formed to study the possibilities in greater depth. In an intense exchange during the months following the workshop, where requirements and capabilities were brought together and balanced, a clear picture of the parameters of a clinical FLASH facility emerged. Subsequently, the team studied critical issues in detail, validating that such a facility is in fact feasible. It is now working out the details of a baseline design, with parameters specified at the system level, and pursuing entirely new possibilities that the study has opened up. A conceptual design report for the facility will be finished by the end of 2020. CHUV is actively seeking funding for the facility, which would require approximately three years from the start of construction through to beam commissioning.

The basic accelerator elements of the 100 MeV-range FLASH facility that emerged from this design process consist of: a photo-injector electron source; a linac optimised for high-current transport and for maximum efficiency in transferring radio-frequency power to the beam; and a beam-delivery system, which shapes the beam for individual treatment and directs it towards the patient. In addition, accelerator and clinical instrumentation are being designed, which must work together to provide the level of precision and repeatability required for patient treatment. This is particularly critical in FLASH treatment, where all feedback and correction of the delivered dose to clinical levels must be completed in substantially less than a second. The radiation field is one area where the requirements of CLIC and FLASH are quite different. In CLIC the beam is focused to a very small spot (roughly 150 nm wide and 3 nm high) for maximum luminosity, whereas in FLASH the beam must be expanded to cover a large area (up to 10 cm) of irregular cross section with high levels of dose uniformity. Although this requires a very different implementation of the beam-delivery systems, both CLIC and FLASH are designed using the same beam-dynamics tools and design methodologies.

Many challenges will have to be overcome, not least obtaining regulatory approval for such a novel system, but we are convinced that the fundamental ideas are sound and that the goal is within reach. A clinical FLASH facility based on CLIC technology is set to be an excellent example of the impact that developments made in the pursuit of fundamental science can have on society.

The post Adapting CLIC tech for FLASH therapy appeared first on CERN Courier.

Feature A collaboration between CERN and Lausanne University Hospital will see technology developed for the proposed Compact Linear Collider drive a novel cancer radiotherapy facility. https://cerncourier.com/wp-content/uploads/2020/12/CCSupp_1_Med_2020_FLASH-202008-108_16.jpg
Beating cardiac arrhythmia https://cerncourier.com/a/beating-cardiac-arrhythmia/ Wed, 18 Nov 2020 09:29:44 +0000 https://preview-courier.web.cern.ch/?p=89999 Adriano Garonna co-founded EBAMed, a company which develops technologies to enable non-invasive treatments of heart arrhythmia using proton beams.

EBAMed’s technical team

In December last year, a beam of protons was used to treat a patient with cardiac arrhythmia – an irregular beating of the heart that affects around 15 million people in Europe and North America alone. The successful procedure, performed at the National Center of Oncological Hadrontherapy (CNAO) in Italy, signalled a new application of proton therapy, which has been used to treat upwards of 170,000 cancer patients worldwide since the early 1990s.

In parallel to CNAO – which is based on accelerator technologies developed in conjunction with CERN via the TERA Foundation – a Geneva-based start-up called EBAMed (External Beam Ablation), founded by CERN alumnus Adriano Garonna, aims to develop and commercialise image-guidance solutions for non-invasive treatments of heart arrhythmias. EBAMed’s technology is centred on an ultrasound imaging system that monitors a patient’s heart activity, interprets the motion in real time and sends a signal to the proton-therapy machine when the radiation should be sent. Once targeted, the proton beam ablates specific heart tissues to stop the local conduction of disrupted electrical signals.

Fast learner

“Our challenge was to find a solution using the precision of proton therapy on a fast and irregular moving target: the heart,” explains Garonna. “The device senses motion at a very fast rate, and we use machine learning to interpret the images in real time, which allows robust decision-making.” Unlike current treatments, which can be lengthy and costly, he adds, people can be treated as outpatients; the intervention is non-invasive and “completely pain-free”.

The recipient of several awards – including TOP 100 Swiss Startups 2019, Venture Business Plan 2018, MassChallenge 2018, Venture Kick 2018 and IMD 2017 Start-up Competition – EBAMed recently received a €2.4 million grant from the European Union to fund product development and the first human tests.

Garonna’s professional journey began when he was a summer student at CERN in 2007, working on user-interface software for a new optical position-monitoring system at LHC Point 5 (CMS). Following his graduation, Garonna returned to CERN as a PhD student with the TERA Foundation and École Polytechnique Fédérale de Lausanne, and then as a fellow working for the Marie Curie programme PARTNER, a training network for European radiotherapy. This led to a position as head of therapy accelerator commissioning at MedAustron in Austria – a facility for proton and ion therapy based, like CNAO, on TERA Foundation/CERN technology. After helping deliver the first patient treatments at MedAustron, Garonna returned to CERN and entered informal discussions with TERA founder Ugo Amaldi, who was one of Garonna’s PhD supervisors, about how to take the technology further. Along with former CERN engineer Giovanni Leo and arrhythmia expert Douglas Packer, the group founded EBAMed in 2018.

“Becoming an entrepreneur was not my initial purpose, but I was fascinated by the project and convinced that a start-up was the best vehicle to bring it to market,” says Garonna. Not having a business background, he benefitted from the CERN Knowledge Transfer entrepreneurship seminars as well as the support from the Geneva incubator Fongit and courses organised by Innosuisse, the Swiss innovation agency. Garonna also drew on previous experience gained while at CERN. “At CERN most of my projects involved exploring new areas. While I benefitted from the support of my supervisors, I had to drive projects on my own, seek the right solutions and build the appropriate ecosystem to obtain results. This certainly developed an initiative-driven, entrepreneurial streak in me.”

Healthy competition

Proton therapy is booming, with almost 100 facilities operating worldwide and more than 35 under construction. EBAMed’s equipment can be installed in any proton-therapy centre irrespective of its technology, says Garonna. “We already have prospective patients contacting us as they have heard of our device and wish to benefit from the treatment. As a company, we want to be the leaders in our field. We do have a US competitor, who has developed a planning system using conventional radiotherapy, and we are grateful that there is another player on the market as it helps pave the way to non-invasive treatments. Additionally, it is dangerous to be alone, as that could imply that there is no market in the first place.”

Leaving the security of a job to risk it all with a start-up is a gradual process, says Garonna. “It’s definitely challenging to jump into what seems like cold water… you have to think if it is worth the journey. If you believe in what you are doing, I think it will be worth it.”

The post Beating cardiac arrhythmia appeared first on CERN Courier.

Careers Adriano Garonna co-founded EBAMed, a company which develops technologies to enable non-invasive treatments of heart arrhythmia using proton beams. https://cerncourier.com/wp-content/uploads/2020/11/CCNovDec20_Careers_frontis.jpg
Neutrinos for peace https://cerncourier.com/a/neutrinos-for-peace/ Tue, 10 Nov 2020 18:12:00 +0000 https://preview-courier.web.cern.ch/?p=89440 Detectors similar to those used to hunt for sterile neutrinos could help guard against the extraction of plutonium-239 for nuclear weapons, writes Patrick Huber.

The PROSPECT neutrino detector

The first nuclear-weapons test shook the desert in New Mexico 75 years ago. Weeks later, Hiroshima and Nagasaki were obliterated. So far, these two Japanese cities have been the only ones to suffer such a fate. Neutrinos can help to ensure that no other city has to be added to this dreadful list.

At the height of the arms race between the US and the USSR, stockpiles of nuclear weapons exceeded 50,000 warheads, with the majority being thermonuclear designs vastly more destructive than the fission bombs used in World War II. Significant reductions in global nuclear stockpiles followed the end of the Cold War, but the US and Russia still have about 12,500 nuclear weapons in total, and the other seven nuclear-armed nations have about 1500. Today, the politics of non-proliferation is once again tense and unpredictable. New nuclear security challenges have appeared, often from unexpected actors, as a result of leadership changes on both sides of the table. Nuclear arms races and the dissolution of arms-control treaties have yet again become a real possibility. A regional nuclear war involving just 1% of the global arsenal would cause a massive loss of life, trigger climate effects leading to crop failures and jeopardise the food supply of a billion people. Until we achieve global disarmament, nuclear non-proliferation efforts and arms control are still the most effective tools for nuclear security.

Not a bang but a whimper

The story of the neutrino is closely tied to nuclear weapons. The first serious proposal to detect the particle hypothesised by Pauli, put forward by Clyde Cowan and Frederick Reines in the early 1950s, was to use a nuclear explosion as the source (see “Daring experiment” figure). Inverse beta decay, whereby an electron-antineutrino strikes a free proton and transforms it into a neutron and a positron, was to be the detection reaction. The proposal was approved in 1952 as an addition to an already planned atmospheric nuclear-weapons test. However, while preparing for this experiment, Cowan and Reines realised that by capturing the neutron on a cadmium nucleus, and observing the delayed coincidence between the positron and this neutron, they could use the lower, but steady flux of neutrinos from a nuclear reactor instead (see “First detection” figure). This technique is still used today, but with gadolinium or lithium in place of cadmium.
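
In symbols, the detection reaction reads as follows; the positron provides the prompt signal and the neutron capture the delayed one, and the roughly 1.8 MeV threshold is the standard kinematic value for this process:

```latex
\bar{\nu}_e + p \;\to\; n + e^{+}, \qquad E_{\bar{\nu}} \gtrsim 1.8\ \mathrm{MeV}
```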

Proposal to discover particles using a nuclear explosion

The P reactor at the Savannah River site in South Carolina, which had been built and used to make plutonium and tritium for nuclear weapons, eventually hosted the successful experiment that first detected the neutrino in 1956. Experiments testing the properties of the neutrino, including oscillation searches, continued there until 1988, when the P reactor was shut down.

Neutrinos are not produced in nuclear fission itself, but by the beta decays of neutron-rich fission fragments – on average about six per fission. In a typical reactor fuelled by natural or low-enriched uranium, uranium-235 is initially the only fissile component. During operation a significant number of neutrons are absorbed by uranium-238, which is far more abundant, leading to the formation of uranium-239, which after two beta decays becomes plutonium-239. Plutonium-239 eventually contributes about 40% of the fissions, and hence of the energy production, in a commercial reactor. It is also the isotope used in nuclear weapons.
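
A rough rate estimate follows from these numbers (a sketch: the ~200 MeV released per fission is a textbook figure assumed here, not taken from the article):

```python
# Rough antineutrino output of a reactor, from "about six per fission".
MeV = 1.602e-13        # joules
P_thermal = 1e9        # W: a 1 GW(thermal) reactor
E_fission = 200 * MeV  # ~200 MeV released per fission (assumed textbook value)
NU_PER_FISSION = 6     # from the beta decays of the fission fragments

fission_rate = P_thermal / E_fission      # ~3e19 fissions per second
nu_rate = NU_PER_FISSION * fission_rate   # ~2e20 antineutrinos per second
print(f"{nu_rate:.1e} antineutrinos per second")
```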

The dual-use nature of reactors is at the crux of nuclear non-proliferation. What distinguishes a plutonium-production reactor from a regular reactor producing electricity is whether it is operated in such a way that the plutonium can be taken out of the reactor core before it deteriorates and becomes difficult to use in weapons applications. A reactor with a low content of plutonium-239 makes more and higher energy neutrinos than one rich in plutonium-239.

Lev Mikaelyan and Alexander Borovoi, from the Kurchatov Institute in Moscow, realised that neutrino emissions can be used to infer the power and plutonium content of a reactor. In a series of trailblazing experiments at the Rovno nuclear power plant in the 1980s and early 1990s, their group demonstrated that a tonne-scale underground neutrino detector situated 10 to 20 metres from a reactor can indeed track its power and plutonium content.
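
The logic of such measurements can be sketched in a few lines: the detected inverse-beta-decay (IBD) rate is set by the reactor power and by which isotopes are fissioning, because uranium-235 yields more detectable antineutrinos per fission than plutonium-239. The per-fission yields, energy releases and detector parameters below are approximate illustrative values, not numbers from the Rovno experiments:

```python
import math

# Approximate per-fission IBD yields (cm^2) and energy releases (MeV);
# illustrative literature-style values, assumed for this sketch.
SIGMA_F = {"U235": 6.7e-43, "Pu239": 4.4e-43}
E_F = {"U235": 202.0, "Pu239": 210.0}
MEV = 1.602e-13  # joules

def ibd_rate(P_watt, fractions, n_protons, L_cm):
    """Detected IBD rate (per second) for given fission fractions."""
    E_mean = sum(fractions[i] * E_F[i] for i in fractions) * MEV
    fission_rate = P_watt / E_mean
    sigma_mean = sum(fractions[i] * SIGMA_F[i] for i in fractions)
    return fission_rate * sigma_mean * n_protons / (4 * math.pi * L_cm**2)

# ~6e28 free protons per tonne of organic scintillator; 20 MW core at 20 m.
fresh = ibd_rate(20e6, {"U235": 1.0, "Pu239": 0.0}, 6e28, 2000)
burnt = ibd_rate(20e6, {"U235": 0.6, "Pu239": 0.4}, 6e28, 2000)
print(f"fresh core: {fresh * 86400:.0f} events/day")
print(f"core with 40% Pu fissions: {burnt * 86400:.0f} events/day")
```

The drop in rate as plutonium builds up, at fixed thermal power, is the handle on the core's plutonium content.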

The significant drawback of neutrino detectors in the 1980s was that they needed to be situated underground, beneath a substantial overburden of rock, to shield them from cosmic rays. This greatly limited potential deployment sites. There was a series of application-related experiments – notably the successful SONGS experiment conducted by researchers at Lawrence Livermore National Laboratory, which aimed to reduce cost and improve the robustness and remote operation of neutrino detectors – but all of these detectors still needed shielding.

From cadmium to gadolinium

Synergies with fundamental physics grew in the 1990s, when the evidence for neutrino oscillations was becoming impossible to ignore. With the range of potential oscillation frequencies narrowing, the Palo Verde and Chooz reactor experiments placed multi-tonne detectors about 1 km from nuclear reactors and sought to measure the relatively small θ13 parameter of the neutrino mixing matrix, which expresses the mixing between electron neutrinos and the third neutrino mass eigenstate. Both experiments used large amounts of liquid organic scintillator doped with gadolinium. The goal was to tag antineutrino events by capturing the neutrons on gadolinium, rather than the cadmium used by Reines and Cowan. Gadolinium produces 8 MeV of gamma rays upon de-excitation after a neutron capture. As it has an enormous neutron-capture cross section, even small amounts greatly enhance an experiment’s ability to identify neutrons.
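
The delayed-coincidence tag itself is simple to express in code. The sketch below pairs a prompt, positron-like pulse with a delayed capture pulse; the time and energy windows are typical orders of magnitude (neutron capture on gadolinium takes a few tens of microseconds on average), assumed here for illustration rather than taken from any one experiment:

```python
def find_ibd_pairs(events, t_min=0.5e-6, t_max=100e-6,
                   prompt_E=(1.0, 8.0), delayed_E=(6.0, 10.0)):
    """events: time-ordered list of (time_s, energy_MeV) pulses.
    Returns candidate (prompt, delayed) pairs."""
    pairs = []
    for i, (t1, e1) in enumerate(events):
        if not prompt_E[0] <= e1 <= prompt_E[1]:
            continue
        for t2, e2 in events[i + 1:]:
            dt = t2 - t1
            if dt > t_max:
                break  # events are time-ordered, so stop looking
            if dt >= t_min and delayed_E[0] <= e2 <= delayed_E[1]:
                pairs.append(((t1, e1), (t2, e2)))
    return pairs

# Toy data: a 4 MeV prompt pulse, an 8 MeV Gd-capture pulse 28 us later,
# and an uncorrelated 2 MeV single a second afterwards.
print(find_ibd_pairs([(0.0, 4.0), (28e-6, 8.0), (1.0, 2.0)]))
```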

Delayed coincidence detection scheme

Eventually, neutrino oscillations became an accepted fact, redoubling the interest in measuring θ13. This resulted in three new experiments: Double Chooz in France, RENO in South Korea and Daya Bay in China. Learning lessons from Palo Verde and Chooz, the experiments successfully measured θ13 more precisely than any other neutrino mixing parameter. A spin-off from the Double Chooz experiment was the Nucifer detector (see “Purpose driven” figure), which demonstrated the operation of a robust sub-tonne-scale detector designed with reactor-monitoring missions in mind, in alignment with requirements formulated at a 2008 workshop held by the International Atomic Energy Agency (IAEA). However, Nucifer still needed a significant overburden.

In 2011, however, shortly before the experiments established that θ13 is not zero, fundamental research once again galvanised the development of detector technology for reactor monitoring. In the run-up to the Double Chooz experiment, a group at Saclay started to re-evaluate the predictions for reactor neutrino fluxes – then and now based on measurements at the Institut Laue-Langevin in the 1980s – and found to their surprise that the reactor flux prediction came out 6% higher than before. Given that all prior experiments were in agreement with the old flux predictions, neutrinos were missing. This “reactor-antineutrino anomaly” persists to this day. A sterile neutrino with a mass of about 1 eV would be a simple explanation. This mass range has been suggested by experiments with accelerator neutrinos, most notably LSND and MiniBooNE, though it conflicts with predictions that muon neutrinos should oscillate into such a sterile neutrino, which experiments such as MINOS+ have failed to confirm.

To directly observe the high-frequency oscillations of an eV-scale sterile neutrino you need to get within about 10 m of the reactor. At this distance, backgrounds from the operation of the reactor are often non-negligible, and no overburden is possible – the same conditions a detector on a safeguards mission would encounter.
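
The reason for the short baseline follows from the standard two-flavour disappearance probability (with Δm² in eV², L in metres and E in MeV):

```latex
P_{\bar{\nu}_e \to \bar{\nu}_e}(L,E) \;=\; 1 - \sin^2 2\theta \,
\sin^2\!\left( \frac{1.27\, \Delta m^2\, L}{E} \right)
```

For Δm² ≈ 1 eV² and a typical reactor-antineutrino energy of 4 MeV, the first oscillation maximum sits at L ≈ 5 m, so the modulation washes out unless the detector is within roughly 10 m of the core.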

From gadolinium to lithium

Around half a dozen experimental groups are chasing sterile neutrinos using small detectors close to reactors. Some of the most advanced designs use fine spatial segmentation to reject backgrounds, and replace gadolinium with lithium-6 as the nucleus to capture and tag neutrons. Lithium has the advantage that upon neutron capture it produces an alpha particle and a triton rather than a handful of photons, resulting in a very well localised tag. In a small detector this improves event containment and thus efficiency, and also helps constrain event topology.
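
The capture reaction in question is the following; since both products are heavy charged particles that stop within microns of the capture point, the tag is exceptionally well localised:

```latex
n + {}^{6}\mathrm{Li} \;\to\; {}^{4}\mathrm{He} + {}^{3}\mathrm{H} + 4.78\ \mathrm{MeV}
```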

Following the lithium and finely segmented technical paths, the PROSPECT collaboration and the CHANDLER collaboration (see “Rapid deployment” figure), in which I participate, independently reported the detection of a neutrino spectrum with minimal overburden and high detection efficiency in 2018. This is a major milestone in making non-proliferation applications a reality, since it is the first demonstration of the technology needed for tonne-scale detectors capable of monitoring the plutonium content of a nuclear reactor that could be universally deployed without the need for special site preparation.

The main difference between the two detectors is that PROSPECT, which reported its near-final sterile neutrino limit at the Neutrino 2020 conference, uses a traditional approach with liquid scintillator, whereas CHANDLER, currently an R&D project, uses plastic scintillator. The use of plastic scintillator allows the deployment time-frame to be shortened to less than 24 hours. On the other hand, liquid scintillator allows the exploitation of pulse-shape discrimination to reject cosmic-ray neutron backgrounds, allowing PROSPECT to achieve a much better signal-to-background ratio than any plastic detector to date. Active R&D is seeking to improve topological reconstruction in plastic detectors and imbue them with pulse-shape discrimination. In addition, a number of safeguard-specific detector R&D experiments have successfully detected reactor neutrinos using plastic scintillator in conjunction with gadolinium. In the UK, the VIDARR collaboration has seen neutrinos from the Wylfa reactor, and in Japan the PANDA collaboration successfully operated a truck-mounted detector.
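
A minimal sketch of the pulse-shape discrimination mentioned above uses the common “tail-to-total” charge ratio: neutron-induced (proton-recoil) pulses in liquid scintillator carry relatively more light in their tails than electron-like pulses. The window and threshold values are illustrative assumptions, not parameters of PROSPECT:

```python
def tail_to_total(samples, t_split=40.0):
    """samples: list of (time_ns, amplitude). Tail/total charge ratio."""
    total = sum(a for _, a in samples)
    tail = sum(a for t, a in samples if t >= t_split)
    return tail / total if total > 0 else 0.0

def classify(samples, threshold=0.15):
    """Crude PSD cut: pulses with a heavy tail are flagged neutron-like."""
    return "neutron-like" if tail_to_total(samples) > threshold else "electron-like"

# Toy pulses sampled every 10-20 ns: the neutron pulse decays more slowly.
electron_pulse = [(0, 50), (10, 30), (20, 10), (40, 3), (60, 1)]
neutron_pulse = [(0, 40), (10, 25), (20, 15), (40, 15), (60, 10)]
print(classify(electron_pulse), classify(neutron_pulse))
```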

In parallel to detector development, studies are being undertaken to understand how reactor monitoring with neutrinos would impact nuclear security and support non-proliferation objectives. Two very relevant situations being studied are the 2015 Iran Deal – the Joint Comprehensive Plan of Action (JCPOA) – and verification concepts for a future agreement with North Korea.

Nuclear diplomacy

One of the sticking points in negotiating the 2015 Iran deal was the future of the IR-40 reactor, which was being constructed at Arak, an industrial city in central Iran. The IR-40 was planned to be a 40 MW reactor fuelled by natural uranium and moderated with heavy water, with a stated purpose of isotope production for medical and scientific use. The choice of fuel and moderator is interesting, as it meshes with Iranian capabilities and would serve the stated purpose well and be cost effective, since no uranium enrichment is needed. Equally, however, if one were to design a plutonium-production reactor for a nascent weapons programme, this combination would be one of the top choices: it does not require uranium enrichment, and with the stated reactor power would result in the annual production of about 10 kg of rather pure plutonium-239. This matches the critical mass of a bare plutonium-239 sphere, and it is known that as little as 4 kg can be used to make an effective nuclear explosive. Within the JCPOA it was eventually agreed that the IR-40 could be redesigned, down-rated in power to 20 MW and the new core fuelled with 3.7% enriched fuel, reducing the annual plutonium production by a factor of six.
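
The quoted figure of about 10 kg per year can be checked with a commonly used rule of thumb – roughly 0.8 to 0.9 grams of plutonium-239 per thermal megawatt-day for a natural-uranium-fuelled reactor (an assumed conversion factor, not a reactor-physics calculation):

```python
G_PER_MWD = 0.85  # g of Pu-239 per MW(th)-day, assumed rule of thumb
power_mw = 40     # IR-40 as originally designed
days = 330        # roughly a year of operation, allowing some downtime

pu_kg = power_mw * days * G_PER_MWD / 1000
print(f"~{pu_kg:.0f} kg of Pu-239 per year")  # ~11 kg, consistent with ~10 kg
```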

A spin off from Double Chooz

A 10 to 20 tonne neutrino detector 20 m from the reactor would be able to measure its plutonium content with a precision of 1 to 2 kg. This would be particularly relevant in the so-called N-th month scenario, which models a potential crisis in Iran based on events in North Korea in June 1994. During the 1994 crisis, which risked precipitating war with the US, the nuclear reactor at Yongbyon was shut down, and enough spent fuel rods removed to make several bombs. IAEA protocols were sternly tested. The organisation’s conventional safeguards for operating reactors consist of containment and surveillance – seals, for example, to prevent the unnoticed opening of the reactor, and cameras to record the movement of fuel, most crucially during reactor shutdowns. In the N-th month scenario, the IR-40 reactor, in its pre-JCPOA configuration (40 MW, rather than the renegotiated power of 20 MW), runs under full safeguards for N–1 months. In month N, a planned reactor shutdown takes place. At this point the reactor would contain 8 kg of weapons-grade plutonium. For unspecified reasons the safeguards are then interrupted. In month N+1, the reactor is restarted and full safeguards are restored. The question is: are the 8 kg of plutonium still in the reactor core, or has the core been replaced with fresh fuel and the 8 kg of plutonium illicitly diverted?

The disruption of safeguards could either be due to equipment failure – a more frequent event than one might assume – or due to events in the political realm ranging from a minor unpleasantness to a full-throttle dash for a nuclear weapon. Distinguishing the two scenarios would be a matter of utmost urgency. According to an analysis including realistic backgrounds extrapolated from the PROSPECT results, this could be done in 8 to 12 weeks with a neutrino detector.

No conventional non-neutrino technologies can match this performance without shutting the reactor down and sampling a significant fraction of the highly radioactive fuel. The conventional approach would be extremely disruptive to reactor operations and would put inspectors and plant operators at risk of radiation exposure. Even if the host country were to agree in principle, developing a safe plan and having all sides agree on its feasibility would take months at the very least, creating dangerous ambiguity in the interim and giving hardliners on both sides time to push for an escalation of the crisis. The conventional approach would also be significantly more expensive than a neutrino detector.

New negotiating gambit

The June 1994 crisis at Yongbyon still overshadows negotiations with North Korea, since, as far as North Korea is concerned, it discredited the IAEA. Both during the crisis, and subsequently, international attempts at non-proliferation failed to prevent North Korea from acquiring nuclear weapons – its first nuclear-weapons test took place in 2006 – or even to constrain its progress towards a small-scale operational nuclear force. New approaches are therefore needed, and recent attempts by the US to achieve progress on this issue prompted an international group of about 20 neutrino experts from Europe, the US, Russia, South Korea, China and Japan to develop specific deployment scenarios for neutrino detectors at the Yongbyon nuclear complex.

The main concern is the 5 MWe reactor, which, though named for its electrical power, has a thermal power of 20 MW. This gas-cooled graphite-moderated reactor, fuelled with natural uranium, has been the source of all of North Korea’s plutonium. The specifics of this reactor, and in particular its fuel cladding, which makes prolonged wet-storage of irradiated fuel impossible, represent such a proliferation risk that anything but a monitored shutdown prior to a complete dismantling appears inappropriate. To safeguard against the regime reneging on such a deal, were it to be agreed, a relatively modest tonne-scale neutrino detector right outside the reactor building could detect a powering up of this reactor within a day.

The MiniCHANDLER detector

North Korea is also constructing the Experimental Light Water Reactor at Yongbyon. A 150 MW water-moderated reactor running with low-enriched fuel, this reactor would not be particularly well suited to plutonium production. Its design is not dissimilar to much larger reactors used throughout the world to produce electricity, and it could help address the perennial lack of electricity that has limited the development and growth of the country’s economy. North Korea may wish to operate it indefinitely. A larger, 10 tonne neutrino detector could detect any irregularities during its refuelling – a tell-tale sign of a non-civilian use of the reactor – on a timescale of three months, which is within the goals set by the IAEA.

In a different scenario, wherein the goal would be to monitor a total shutdown of all reactors at Yongbyon, it would be feasible to bury a Daya-Bay-style 50 tonne single volume detector under the Yak-san, a mountain about 2 km outside of the perimeter of the nuclear installations (see “A different scenario” figure). The cost and deployment timescale would be more onerous than in the other scenarios.

In the case of longer distances between reactor and detector, detector masses must increase to compensate for the inverse-square reduction in the reactor-neutrino flux. As cosmic-ray backgrounds remain constant, the detectors must be deployed deep underground, beneath an overburden of several hundred metres of rock. To this end, the UK’s Science and Technology Facilities Council, the UK Atomic Weapons Establishment and the US Department of Energy are funding the WATCHMAN collaboration to pursue the construction of a multi-kilo-tonne water-Cherenkov detector at the Boulby mine, 20 km from two reactors in Hartlepool, in the UK. The goal is to demonstrate the ability to monitor the operational status of the reactors, which have a combined power of 3000 MW. In a use-case context this would translate to excluding the operation of an undeclared 10 to 20 MW reactor within a radius of a few kilometres, but no safeguards scenario has emerged where this would give a unique advantage.
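
The underlying scaling is easy to state: to hold the event rate fixed as the standoff distance grows, the detector mass must grow as the square of the distance. A sketch, normalised to the tonne-scale, ~20 m deployments discussed earlier:

```python
def required_mass(L_m, ref_mass_t=1.0, ref_L_m=20.0):
    """Detector mass (tonnes) giving the same event rate as ref_mass_t
    at ref_L_m, for the same reactor power (flux falls as 1/L^2)."""
    return ref_mass_t * (L_m / ref_L_m) ** 2

for L in (20, 200, 2000, 20000):
    print(f"L = {L:>6} m -> ~{required_mass(L):.0e} tonnes")
# Going from 20 m to 20 km costs a factor of a million in detector mass,
# which is why remote monitoring calls for multi-kilo-tonne detectors.
```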

Inverse-square scaling eventually breaks down around 100 km, as at that distance the backgrounds caused by civilian reactors far outshine any undeclared small reactor almost anywhere in the northern hemisphere. Small signals also prevent the use of neutrino detectors for nuclear-explosion monitoring, or to confirm the origin of a suspicious seismic event as being nuclear, as conventional technologies are more feasible than the very large detectors that would be needed. A more promising future application of neutrino-detector technology is to meet the new challenges posed by advanced nuclear-reactor designs.

Advanced safeguards

The current safeguards regime relies on two key assumptions: that fuel comes in large, indivisible and individually identifiable units called “fuel assemblies”, and that power reactors need to be refuelled frequently. Most advanced reactor designs violate at least one of these assumptions. Fuel may come in thousands of small pebbles or be molten, and its coolant may not be transparent, in contrast to current designs, where water is used as moderator, coolant and storage medium in the first years after discharge. Either way, counting and identification of the fuel by serial number may be impossible. And unlike current power reactors, which are refuelled on a 12-to-18-month cycle, allowing in-core fuel to be verified as well, advanced reactors may be refuelled only once in their lifetime.

Three 20 tonne neutrino detectors

Neutrino detectors would not be hampered by any of these novel features. Detailed simulations indicate that they could be effective in addressing the safeguard challenges presented by advanced reactors. Crucially, they would work in a very similar fashion for any of the new reactor designs.

In 2019 the US Department of Energy chartered and funded a study (which I co-chair) with the goal of determining the utility of the unique capabilities offered by neutrino detectors for nuclear security and energy applications. This study includes investigators from US national laboratories and academia more broadly, and will engage and interview nuclear security and policy experts within the Department of Energy, the State Department, NGOs, academia, and international agencies such as the IAEA. The results are expected early in 2021. They should provide a good understanding of where neutrinos can play a role in current and future monitoring and verification agreements, and may help to guide neutrino detectors towards their first real-world applications.

The idea of using neutrinos to monitor reactors has been around for about 40 years. Only very recently, however, as a result of a surge of interest in sterile neutrinos, has detector technology become available that would be practical in real-world scenarios such as the JCPOA or a new North Korean nuclear agreement. The most likely initial application will be near-field reactor monitoring with detectors inside the fence of the monitored facility as part of a regional nuclear deal. Such detectors will not be a panacea to all verification and monitoring needs, and can only be effective if there is a sincere political will on both sides, but they do offer more room for creative diplomacy, and a technology that is robust against the kinds of political failures which have derailed past agreements. 

The post Neutrinos for peace appeared first on CERN Courier.

Feature Detectors similar to those used to hunt for sterile neutrinos could help guard against the extraction of plutonium-239 for nuclear weapons, writes Patrick Huber. https://cerncourier.com/wp-content/uploads/2020/10/CCNovDec20_PROLIF_yale2.jpg
CLIC lights the way for FLASH therapy https://cerncourier.com/a/clic-lights-the-way-for-flash-therapy/ Tue, 10 Nov 2020 17:25:23 +0000 https://preview-courier.web.cern.ch/?p=89890 A new collaboration between CERN and CHUV plans to use ultrafast bursts of electrons to destroy tumours.

High-gradient accelerating structure

Technology developed for the proposed Compact Linear Collider (CLIC) at CERN is poised to make a novel cancer radiotherapy facility a reality. Building on recently revived research from the 1970s, oncologists believe that ultrafast bursts of electrons damage tumours more than healthy tissue. This “FLASH effect” could be realised by using high-gradient accelerator technology from CLIC to create a new facility at Switzerland’s Lausanne University Hospital (CHUV).

Traditional radiotherapy scans photon beams from multiple angles to focus a radiation dose on tumours inside the body. More recently, hadron therapy has offered a further treatment modality: by tuning the energy of a beam of protons or ions so that they stop in the tumour, the particles deposit most of the radiation dose there (the so-called Bragg peak), while sparing the surrounding healthy tissue by comparison. Both of these treatments deliver small doses of radiation to a patient over an extended period, whereas FLASH radiotherapy is thought to require a maximum of three doses, all lasting less than 100 ms.

Look again

When the FLASH effect was first studied in the 1970s, it was assumed that all tissues suffer less damage when a dose is ultrafast, regardless of whether they are healthy or tumorous. In 2014, however, CHUV researchers published a study in which 200 mice were given a single dose of 4.5 MeV gamma rays at a conventional therapy dose-rate, while others were given an equivalent dose at the much faster FLASH-therapy rate. The results showed explicitly that while the normal tissue was damaged significantly less by the ultrafast bursts, the damage to the tumour stayed consistent for both therapies. In 2019, CHUV applied the first FLASH treatment to a cancer patient, finding similarly positive results: a 3.5 cm diameter skin tumour completely disappeared using electrons from a 5.6 MeV linear accelerator, “with nearly no side effects”. The challenge was to reach deeper tumours.

Now, using high-gradient “X-band” radio-frequency cavity technology developed for CLIC, CHUV has teamed up with CERN to develop a facility that can produce electron beams with energies around 100 MeV, in order to reach tumour depths of up to 20 cm. The idea came about three years ago when it was realised that CLIC technology was almost a perfect match for what CHUV were looking for: a high-powered accelerator, which uses X-band technology to accelerate particles over a short distance, has a high luminosity, and utilises a high current that allows a higher volume of tumour to be targeted.

“CLIC has the ability to accelerate a large amount of charge to get enough luminosity for physics studies,” explains Walter Wuensch, who heads the FLASH project at CERN. “People tend to focus on the accelerating gradient, but as important, or arguably more important, is the ability to control high-current, low-emittance beams.”

The first phase of the collaboration is nearing completion, with a conceptual design report, funded by CHUV, being prepared jointly by CERN and CHUV. The development and construction of the first facility, which would be housed at CHUV, is predicted to cost around €25 million, and CHUV aims to complete the facility within three years.

“The intention of CERN and the team is to be heavily involved in the process of getting the facility built and operating,” states Wuensch. “It really looks like it has the potential to be an important complement to existing radiation therapies.”

Cancer therapies have taken advantage of particle accelerators for many decades, with proton radiotherapy entering the scene in the 1990s. The CERN-based Proton-Ion Medical Machine Study, spawned by the TERA Foundation, resulted in the National Centre for Cancer Hadron Therapy (CNAO) in Italy and MedAustron in Austria, which have made significant progress in the field of proton and ion therapy. FLASH radiotherapy would add electrons to the growing list of particle-therapy modalities.

The post CLIC lights the way for FLASH therapy appeared first on CERN Courier.

News A new collaboration between CERN and CHUV plans to use ultrafast bursts of electrons to destroy tumours. https://cerncourier.com/wp-content/uploads/2020/11/CCNovDec20_NA_CLIC.jpg
TESLA’s high-gradient march https://cerncourier.com/a/teslas-high-gradient-march/ Tue, 10 Nov 2020 17:20:31 +0000 https://preview-courier.web.cern.ch/?p=89827 The TESLA Technology Collaboration has played a major role in the development of superconducting radio-frequency cavities for a wide variety of applications.

Superconducting RF cavities

Energetic beams of charged particles are essential for high-energy physics research, as well as for studies of nuclear structure and dynamics, and deciphering complex molecular structures. In principle, generating such beams is simple: provide an electric field for acceleration and a magnetic field for bending particle trajectories. In practice, however, the task becomes increasingly challenging as the desired particle energy goes up. Very high electric fields are required to attain the highest energy beams within practical real-estate constraints.

The most efficient way to generate the very high electric fields in a vacuum environment required to accelerate a beam is to build up a resonant excitation of radio waves inside a metallic cavity. There is something of an art to shaping such cavities to “get the best bang for the buck” for a particular application. The radio-frequency (RF) fields are inherently time-varying, and bunches of charged particles need to arrive with the right timing if they are to see only forward-accelerating electric fields. The desirable very high resonant electric fields (e.g. 5–40 MV/m) require very high currents in the cavity walls. These currents are simply not sustainable for long durations using even the best normal-conducting materials, as they would melt from resistive heating.
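
In standard linac notation, the energy gained by a particle of charge q crossing a cavity of active length L is given by the textbook relation

```latex
\Delta W \;=\; q\, E_0\, T\, L \cos\varphi_s ,
```

where E0 is the average axial electric field, T ≤ 1 is the transit-time factor accounting for the field oscillating while the particle crosses the gap, and φs is the particle’s phase relative to the RF crest – which is why bunch timing matters so much.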

Superconducting materials, on the other hand, can support sustainable high-accelerating gradients with an affordable electricity bill. Early pioneering work demonstrating the first beam-acceleration using superconducting radio-frequency (SRF) cavities took place in the late 1960s and early 1970s at Stanford, Caltech, the University of Wuppertal and Karlsruhe. The potential for real utility was clear, but techniques and material refinements were needed. Several individual laboratories began to take up the challenge for their own research needs. Solutions were developed for electron acceleration at CESR, HERA, TRISTAN, LEP II and CEBAF, while heavy-ion SRF acceleration solutions were developed at Stony Brook, ATLAS, ALPI and others. The community of SRF accelerator physicists was small but the lessons learned were consistently shared and documented. By the early 1990s, SRF technology had matured such that complex large-scale systems were credible and the variety of designs and applications began to blossom.

The TESLA springboard

In 2020, the TESLA Technology Collaboration (TTC) celebrates 30 years of collaborative efforts on SRF technologies. The TTC grew out of the first international TESLA (TeV Energy Superconducting Linear Accelerator) workshop, which was held at Cornell University in July 1990. Its aim was to define the parameters for a superconducting linear collider for high-energy physics operating in the TeV region and to explore how to increase the gradients and lower the costs of the accelerating structures. It was clear from the beginning that progress would require a large international collaboration, and the Cornell meeting set in motion a series of successes that are ongoing to this day – including FLASH and the European XFEL at DESY. The collaboration also led to proposals for several large SRF-based research facilities including SNS, LCLS-II, ESS, PIP-II and SHINE, as well as a growing number of smaller facilities around the world.

At the time of the first TESLA collaboration meeting, the state-of-the-art in accelerating gradients for electrons was around 5 MV/m, both in the operating SRF systems of TRISTAN at KEK and HERA at DESY and in those of LEP-II at CERN and CEBAF at Jefferson Lab (JLab), which were then under construction. Many participants in this meeting agreed to push for a five-fold increase in the design accelerating gradient, to 25 MV/m, to meet the dream goal of a TESLA centre-of-mass energy of 1 TeV. The initial focus of the collaboration was centred on the design, construction and commissioning of a technological demonstrator, the TESLA Test Facility (TTF) at DESY. In 2004, SRF was selected as the basis for an International Linear Collider (ILC) design and, shortly afterwards, the TESLA collaboration was re-formed as the TESLA Technology Collaboration with a scope beyond the original motivation of high-energy physics. The TTC, with its incredible worldwide collaboration spirit, has played a major role in the growth of the SRF community, facilitating numerous important contributions over the past 30 years.

30 years of gradient march

Conceptually, the objective of simply providing “nice clean” niobium surfaces on RF structures seems pretty straightforward. Important subtleties begin to emerge, however, as one considers that the high RF-surface currents required to support magnetic fields up to ~100 mT flow only in the top 100 nm of the niobium surface, which must offer routine surface resistances at the nano-ohm level over areas of around 1 m². Achieving blemish-free, contamination-free surfaces that present excellent crystal lattice structure even in this thin surface layer is far from easy.
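
These quantities are tied together by a relation every SRF designer uses: the intrinsic quality factor Q0 is the ratio of a purely geometric factor G to the surface resistance, and the latter has a temperature-dependent BCS part plus a residual term (G ≈ 270 Ω for the TESLA cavity shape is a standard figure, quoted here as an assumption):

```latex
Q_0 \;=\; \frac{G}{R_s}, \qquad
R_s \;=\; R_{\rm BCS}(T) + R_{\rm res}, \qquad
R_{\rm BCS} \;\propto\; \frac{\omega^2}{T}\, e^{-\Delta/k_B T}
```

A surface resistance of 10 nΩ thus corresponds to Q0 ≈ 2.7 × 10¹⁰.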

The march of progress in cavity gradient for linacs and the many representative applications over the past 50 years (see figure “Gradient growth”) are due to breakthroughs in three main areas: material purity, fabrication and processing techniques. The TTC had a major impact on each of these areas.

RF linac accelerating gradient achievements

With some notable exceptions, bulk niobium cavities fabricated from sheet stock material have been the standard, even though the required metallurgical processes present challenges. Cycles of electron-beam vacuum refining, rolling and intermediate anneals are provided by only a few international vendors. Raising the purity of the deliverable material required a concerted effort, resulting in the avoidance of foreign-material inclusions, which can be deadly to performance when uncovered in the final step of surface processing. The figure of merit for purity is the ratio of room-temperature to cryogenic normal-conducting resistivity – the residual resistance ratio, RRR. The common cavity-grade niobium material specification has thus come to be known as high-RRR grade.
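
In symbols, with the low-temperature value taken in the normal state just above niobium’s superconducting transition at about 9.2 K (high-RRR grade is typically taken to mean values of about 300 or more):

```latex
\mathrm{RRR} \;=\; \frac{\rho(295\ \mathrm{K})}{\rho_{\rm normal}(\sim 10\ \mathrm{K})}
```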

Another later pursuit of pure niobium is the so-called “large grain” or “direct-from-ingot” material. Rather than insist on a controlled ~30 µm grain-size distribution (grains being microcrystals in the structure), this material uses sheet slices cut directly from large ingots having much larger, but arbitrarily sized, grains. Although not yet widely used, this material has produced the highest gradient TESLA-style cavities to date – 45 MV/m with a quality factor Q0 > 10¹⁰. Here again, though the topic was initiated at JLab, this fruitful work was accomplished via worldwide international collaborations.

As niobium is a refractory metal that promptly cloaks itself with about 4 nm of dielectric oxide, welding niobium components has to be performed by vacuum electron beam welding. Collaborative efforts in Europe, North America and Asia refined the parameters required to yield consistent niobium welds. The community gradually realised that extreme cleanliness is required in the surface-weld preparation, since even microscopic foreign material will be vaporised during the weld process, leaving behind small voids that become performance-limiting defects.

Having the best niobium is not sufficient, however. Superconductors have inherent critical magnetic field limitations, or equivalently local surface-current density limitations. Because the current flow is so shallow, local magnetic field enhancements induced by microscopic topography translate into gradient-limiting quench effects. Etching of fabricated surfaces has routinely required a combination of hydrofluoric and nitric acids, buffered with phosphoric acid. This exothermic etching process inherently yields step-edge faceting at grain boundaries, which in turn creates local, even nanoscopic, field enhancements, anomalous losses and quenches as the mean surface field is increased. A progression of international efforts at KEK, DESY, CEA-Saclay and JLab eliminated this problem through the development of electro-polishing techniques. Following a deeper understanding of the underlying electrochemistry, accelerating gradients above 40 MV/m are now attainable with niobium.

Another vexing problem that TTC member institutions helped to solve was the presence of “Q-drop” in the region of high surface magnetic field, for which present explanations point to subtle migration of near-surface oxygen deeper into the lattice, where it inhibits the subsequent formation of lossy nanohydrides on cool-down. Avoidance of nanohydrides, whose superconductivity by proximity effect breaks down in the Q-drop regime, is required to sustain accelerating gradients above 25 MV/m for some structures.

Cleaning up

TTC members have also shared analyses and best practices in cleaning and cleanroom techniques, which have evolved dramatically during the past 30 years. This has helped to beat down the most common challenge for developers and users of SRF accelerating cavities: particulate-induced field emission, whereby very high peak surface electric fields can turn even micron-scale foreign material into parasitic electron field emission sources, with resulting cryogenic and radiation burdens. Extended interior final rinsing with high-pressure ultra-pure water prior to cavity assembly has become standard practice, while preparation and assembly of all beamline vacuum hardware under ISO 4 cleanroom conditions is necessary to maintain these clean surfaces for accelerator operations.

ESS elliptical section

The most recent transformation has come with the recognition that interstitial doping of the niobium surface with nitrogen can reduce SRF surface resistance much more than was dreamed possible, reducing the cryogenic heat load. While still the subject of material research, this new capability was rapidly adopted into the specification for LCLS-II cavities and is also being considered for an ILC. The effort started in the US and quickly propagated internationally via the TTC, for example in cavity tests at the European Spallation Source (see “Vertical test” image). Earlier this year, Q-values of 3–4 × 10¹⁰ at 2 K and 30 MV/m were reported in TESLA-style cavities – representing tremendous progress, but with much optimisation still to be carried out.
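
The cryogenic stakes are easy to quantify. A sketch of the RF power dissipated into the 2 K helium bath per cavity, using standard TESLA-cavity figures (active length ~1.04 m, R/Q ~ 1036 Ω in the linac convention – assumed values, not from the text):

```python
L_ACTIVE = 1.038   # m, TESLA 9-cell active length (assumed)
R_OVER_Q = 1036.0  # ohms, linac convention (assumed)
E_ACC = 30e6       # V/m, the gradient quoted above

for Q0 in (1e10, 3.5e10):
    V = E_ACC * L_ACTIVE             # accelerating voltage
    P_diss = V**2 / (R_OVER_Q * Q0)  # watts dissipated into the 2 K bath
    print(f"Q0 = {Q0:.1e} -> {P_diss:.0f} W per cavity at 30 MV/m")
# Tripling Q0 cuts the 2 K heat load by a factor of three; with each watt
# at 2 K costing several hundred watts of wall-plug power, this dominates
# the electricity bill of a CW superconducting linac.
```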

One of the main goals of the TTC has been to bridge the gap between state-of-the-art R&D on laboratory prototypes and actual accelerator components in operating facilities, with the clear long-term objective to enable superconducting technology for a TeV-scale linear collider. This objective demanded a staged approach and intense work on the development of all the many peripherals and subcomponents. The collaboration embraced a joint effort between the initial partners to develop the TTF at DESY, which aimed to demonstrate reliable operation of an electron superconducting linac at gradients above 15 MV/m in “vector sum” control – whereby many cavities are fed by a single high-power RF source to improve cost effectiveness. In 1993 the collaboration finalised a 1.3 GHz cavity design that is still the baseline of large projects like the European XFEL, LCLS-II and SHINE, and nearly all L-band-based facilities.

Towards a linear collider

An intense collaborative effort started for the development of all peripheral components, for example power couplers, high-order mode dampers, digital low-level RF systems and cryomodules with unprecedented heat load performances. Several of these components were designed by TTC partners in an open collaborative and competitive effort, and a number of them can be found in existing projects around the world. The tight requirements imposed by the scale of a linear collider required an integrated design of the accelerating modules, containing the cavities and their peripheral components, which led to the concept of the “TESLA style” cryomodules, variants of which provide the building blocks of the linacs in TTF, European XFEL, LCLS-II and SHINE.

Half-wave resonator string assembly

The success of the TTF, which delivered its first beam in 1997, led it to become the driver for a next-generation light source at DESY, the VUV-FEL, which produced first light in 2005 and which later became the FLASH facility. The European XFEL built on this strong heritage, its large scale demanding a new level of design consolidation and industrialisation. It is remarkable to note that the total number of such TESLA-style cavities installed or to be installed in presently approved accelerators is more than 1800. Were a 250 GeV ILC to go ahead in Japan, approximately 8000 such units would be required. (Note that an alternative proposal for a high-energy linear collider, the Compact Linear Collider, relies on a novel dual-beam acceleration scheme that does not require SRF cavities.)

Since the partners collaborating on the early TESLA goal of a linear collider were also involved in other national and international projects for a variety of applications and domains, the first decade of the 21st century saw the TTC broaden its reach. For example, we started including reports from other projects, most notably the US Spallation Neutron Source, and gradually opened to the community working on low-beta ion and proton superconducting cavities, such as the half-wave resonator string collaboratively developed at Argonne National Lab and now destined for use in PIP-II at Fermilab (see “Low-beta cavities” image). TTC meetings include topical sessions with industries to discuss how to shorten the path from development to production. Recently, the TTC has also begun to facilitate collaborative exchanges on alternative SRF materials to bulk niobium, such as Nb3Sn and even hybrid multilayer films, for potential accelerator applications.

Sustaining success

The mission of the TTC is to advance SRF technology R&D and related accelerator studies across the broad diversity of scientific applications. It is to provide a bridge for open communication and sharing of ideas, development and testing across associated projects. The TTC supports and encourages the free and open exchange of scientific and technical knowledge, engineering designs and equipment. Furthermore, it is based on cooperative work on SRF accelerator technology by research groups at TTC member institution laboratories and test facilities. The current TTC membership consists of 60 laboratories and institutes in 12 countries across Europe, North America and Asia. Since progress in cavity performance and related SRF technologies is so rapid, the major TTC meetings have been frequent.

Distribution of superconducting particle accelerators

Particle accelerators using SRF technologies have been applied widely, from small facilities for medical applications up to large-scale projects for particle physics, nuclear physics, neutron sources and free-electron lasers (see “Global view” figure). Five large-scale (> 100 cavities) SRF projects are currently under construction in three regions: ESS in Europe, FRIB and LCLS-II in the US, and SHINE (China) and RAON (Korea) in Asia. Close international collaboration will continue to support progress in these and future projects, including SRF thin-film technology relevant for a possible future circular electron–positron collider. Perhaps the next wave of SRF technology will be the maturation of economical small-scale applications with high multiplicity and international standards. As an ultimate huge future SRF project, realising an ILC will indeed require sustained broad international collaboration.

The open and free-exchange model that for 30 years has enabled the TTC to make broad progress in SRF technology is a major contribution to science diplomacy efforts on a worldwide scale. We celebrate the many creative and collaborative efforts that have served the international community well via the TESLA Technology Collaboration.

The post TESLA’s high-gradient march appeared first on CERN Courier.

Feature The TESLA Technology Collaboration has played a major role in the development of superconducting radio-frequency cavities for a wide variety of applications. https://cerncourier.com/wp-content/uploads/2020/11/CCNovDec20_TESLA_beta.jpg
Spiralling into the femtoscale https://cerncourier.com/a/spiralling-into-the-femtoscale/ Tue, 10 Nov 2020 16:24:49 +0000 https://preview-courier.web.cern.ch/?p=89794 The SPIRAL2 facility at GANIL will probe short-lived heavy nuclei and address applications in fission and materials science.

Radio-frequency quadrupole

Nuclear physics is as wide-ranging and relevant today as ever before in the century-long history of the subject. Researchers study exotic systems from hydrogen-7 to the heaviest nuclides at the boundaries of the nuclear landscape. By constraining the nuclear equation of state using heavy-ion collisions, they peer inside stars in controlled laboratory tests. By studying weak nuclear processes such as beta decays, they can even probe the Standard Model of particle physics. And this is not to mention numerous applications in accelerator-based atomic and condensed-matter physics, radiobiology and industry. These nuclear-physics research areas are just a selection of the diverse work done at the Grand Accélérateur National d’Ions Lourds (GANIL), in Caen, France.

GANIL has been operating since 1983, initially using four cyclotrons, with a fifth, the Cyclotron pour Ions de Moyenne Energie (CIME), added in 2001. The latter is used to reaccelerate short-lived nuclei produced using beams from the other cyclotrons – the Système de Production d’Ions Radioactifs en Ligne (SPIRAL1) facility. The beams produced by these cyclotrons feed eight beamlines with specialised instrumentation. Parallel operation allows the running of three experiments simultaneously, thereby optimising the available beam time. These facilities provide both high-intensity stable-ion beams, from carbon-12 to uranium-238, and lower intensity radioactive-ion beams of short-lived nuclei, with lifetimes from microseconds to milliseconds, such as helium-6, helium-8, silicon-42 and nickel-68. Coupled with advanced detectors, all these beams allow nuclei to be explored in terms of excitation energy, angular momentum and isospin.

The new SPIRAL2 facility, which is currently being commissioned, will take this work into the next decade and beyond. The most recent step forward is the beam commissioning of a new superconducting linac – a major upgrade to the existing infrastructure. Its maximum beam intensity of 5 mA, or 3 × 10¹⁶ particles per second, is more than two orders of magnitude higher than at the previous facility. The new beams and state-of-the-art detectors will allow physicists to explore phenomena at the femtoscale right up to the astrophysical scale.

Landmark facility

SPIRAL2 was approved in 2005. It now joins a roster of cutting-edge European nuclear-physics-research facilities which also features the Facility for Antiproton and Ion Research (FAIR), in Darmstadt, Germany, ISOLDE and nTOF at CERN, and the Joint Institute for Nuclear Research (JINR) in Russia. Due to their importance in the European nuclear-physics roadmap, SPIRAL2 and FAIR are both now recognised as European Strategy Forum on Research Infrastructures (ESFRI) Landmark projects, alongside 11 other facilities, including accelerator complexes such as the European X-Ray Free-Electron Laser, and telescopes such as the Square Kilometre Array.

Construction began in 2011. The project was planned in two phases: first, the construction of a linac for very-high-intensity stable beams, together with the associated experimental halls (see “High intensity” figure); and second, infrastructure for the reacceleration of short-lived fission fragments, produced using deuteron beams on a uranium target and reaccelerated through one of the GANIL cyclotrons. Though the second phase is currently on hold, SPIRAL2’s new superconducting linac is now in a first phase of commissioning.

Superconducting linac and experimental halls

Most linacs are optimised for a beam with specific characteristics, supplied again and again by a single injector: the particle species, the velocity profile of the particles being accelerated and the beam intensity all tend to be fixed. By tuning the phase of the electric fields in the accelerating structures, charged particles surf the radio-frequency waves in the cavities with optimal efficiency in a single pass. Though this is the case for most large projects, such as Linac4 at CERN, the Spallation Neutron Source (SNS) in the US and the European Spallation Source in Sweden, SPIRAL2’s linac (see “Multitasking” figure) has been designed for a wide range of ions, energies and intensities.

The multifaceted physics criteria called for an original design featuring a compact multi-cryostat structure for the superconducting cavities, which was developed in collaboration with fellow French national organisations CEA and CNRS. Though the 19 cryomodules are comparable in number to the 23 employed by the larger and more powerful SNS accelerator, the new SPIRAL2 linac has far fewer accelerating gaps. On the other hand, compared to normal-conducting cavities such as those used by Linac4, the power consumption of the superconducting structures at SPIRAL2 is significantly lower, and the linac conforms to additional constraints on the cryostat’s design, operation and cleanliness. The choice of superconducting rather than room-temperature cavities is ultimately linked not only to the need for higher beam intensities and energies, but also to the potential for the larger apertures needed to reduce beam losses.

SPIRAL2 joins a roster of cutting-edge European nuclear-physics-research facilities

Beams are produced using two specialised ion sources. At 200 kW in continuous-wave (CW) mode, the beam power is high enough to burn a hole in the vacuum chamber in less than 35 µs, placing severe additional restrictions on the beam dynamics. Operation at high beam intensities, up to 5 mA, also causes space-charge effects that need to be controlled to avoid a beam halo, which could activate accelerator components and generate neutrons – a greater difficulty in the case of deuteron beams.

For human safety and ease of technical maintenance, beam losses need to be kept below 1 W/m. Here, the SPIRAL2 design has synergies with several other high-power accelerators, leading to improvements in the design of quarter-wave resonator cavities. These are used at heavy-ion accelerators such as the Facility for Rare Isotope Beams in the US and the Rare Isotope Science Project in Korea; for producing radioactive-ion beams and improving beam dynamics at intense light-particle accelerators worldwide; for producing neutrons at the International Fusion Materials Irradiation Facility, the ESS, the Myrrha Multi-purpose Hybrid Research Reactor for High-tech Applications and the SNS; and for a large range of studies relating to materials properties and the generation of nuclear power.

Beam commissioning

Initial commissioning of the linac began by sending beams from the injector to a dedicated system with various diagnostic elements. The injector was successfully commissioned with a range of CW beams, including a 5 mA proton beam, a 2 mA alpha-particle beam, a 0.8 mA oxygen-ion beam and a 25 µA argon-ion beam. In each case, almost 100% transmission was achieved through the radio-frequency quadrupole. Components of the linac were installed, the cryomodules cooled to liquid-helium temperature (4.5 K), and the mechanical stability required to operate the 26 superconducting cavities at their design specifications demonstrated.

Superconducting cryomodules

As GANIL is a nuclear installation, the injection of beams into the linac required permission from the French nuclear-safety authority. Following a rigorous six-year authorisation process, commissioning through the linac began in July 2019. An additional prerequisite was that a large number of safety systems be validated and put into operation. The key commissioning step completed so far is the demonstration of cavity performance at 8 MV/m – a competitive electric field, well above the required 6.5 MV/m. The first beam was injected into the linac in late October 2019. The cavities were tuned, and a low-intensity 200 µA beam of protons was accelerated to the design value of 33 MeV and sent to a first test experiment in the neutrons-for-science (NFS) area. A team from the Nuclear Physics Institute in Prague irradiated copper and iron targets, and the products formed in the reactions were transported by a fast automatic system to a measuring station 40 m away, where their characteristic γ-decays were measured. Precise measurements of such cross-sections are important to benchmark the safety codes required for the operation of nuclear reactors.

SPIRAL2 is now moving towards its design power by gradually increasing the proton beam current and subsequently the duty cycle of the beam – the ratio of pulse duration to the period of the waveform. A similar procedure with alpha particles and deuteron beams will then follow. Physics programmes will begin in autumn next year.
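As a back-of-the-envelope illustration of why the duty cycle matters, the average beam power is simply the CW power scaled by the fraction of time the beam is on. The sketch below uses illustrative SPIRAL2-like numbers (5 mA of singly charged deuterons at 40 MeV, consistent with the 200 kW CW figure quoted above); the ramp-up steps are assumptions for illustration, not a published plan:

```python
# Average beam power during a duty-cycle ramp-up.
def beam_power_watts(current_a: float, energy_ev: float, duty_cycle: float) -> float:
    """Average power of a pulsed beam.

    current_a  : peak current during the pulse (A)
    energy_ev  : kinetic energy per elementary charge (eV), singly charged ions
    duty_cycle : pulse duration divided by pulse period (0..1)
    """
    return current_a * energy_ev * duty_cycle

# Illustrative ramp towards the 200 kW CW design figure (5 mA, 40 MeV deuterons).
for duty in (0.01, 0.1, 0.5, 1.0):
    print(f"duty {duty:4.0%} -> {beam_power_watts(5e-3, 40e6, duty) / 1e3:6.1f} kW")
```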

Future physics

With the new superconducting linac, SPIRAL2 will provide intense beams from protons to nickel – up to 14.5 MeV/A for heavy ions – and continuous and quasi-monoenergetic beams of neutrons up to 40 MeV. With state-of-the-art instrumentation such as the Super Separator Spectrometer (S3), the charged-particle beams will allow the study of very rare events in the intense background of the unreacted beam, with a signal-to-background fraction of 1 in 10¹³. The charged-particle beams will also characterise exotic nuclei with properties very different from those found in nature. This will address questions related to heavy and super-heavy element and isotope synthesis at the extreme boundaries of the periodic table, and the properties of nuclei such as tin-100, which has the same number of neutrons and protons – a far cry from naturally existing isotopes such as tin-112 and tin-124. Here, ground-state properties such as the mass of nuclei must be measured with a precision of one part in 10⁹ – a level of precision equivalent to observing the addition of a pea to the weight of an Airbus A380. SPIRAL2’s low-energy experimental hall for the disintegration, excitation and storage of radioactive ions (DESIR), which is currently under construction, will further facilitate detailed studies of the ground-state properties of exotic nuclei, fed both by S3 and by SPIRAL1, the existing upgraded facility for reaccelerated exotic beams. The commissioning of S3 is expected in 2023 and experiments in DESIR in 2025. In parallel, continuous improvement of the SPIRAL2 facility will begin with the integration of a new injector to substantially increase the intensity of heavy-ion beams.

Properties must be measured with a level of precision equivalent to observing the addition of a pea to the weight of an Airbus A380
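The analogy can be checked with rough numbers. Taking an A380 maximum take-off weight of about 575 tonnes and a pea of about half a gram (both illustrative assumptions), the fractional change is indeed of order one part in 10⁹:

```python
# Sanity check of the pea-on-an-Airbus precision analogy.
A380_MASS_KG = 575e3   # ~maximum take-off weight of an A380 (illustrative)
PEA_MASS_KG = 0.5e-3   # ~mass of a pea (illustrative)

fractional_change = PEA_MASS_KG / A380_MASS_KG
print(f"1 part in {1 / fractional_change:.1e}")  # ~1 part in 1.2e9
```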

Thanks to its very high neutron flux – up to two orders of magnitude higher, in the energy range between 1 and 40 MeV, than at facilities like LANSCE at Los Alamos, nTOF at CERN and GELINA in Belgium – SPIRAL2 is also well suited for applications such as the transmutation of nuclear waste in accelerator-driven systems, the design of present and next-generation nuclear reactors, and studies of the effects of neutrons on materials and biological systems. Light-ion beams from the linac, including alpha particles and lithium-6 and lithium-7 ions impinging on lead and bismuth targets, will also be used to investigate more efficient methods for the production of certain radioisotopes for cancer therapy.

Developments at SPIRAL2 are moving forward quickly. In September, control of the full emittance and of space-charge effects was demonstrated – a crucial step towards the design performance of the linac – and a first neutron beam was produced at NFS using proton beams. The future looks bright. With the new SPIRAL2 superconducting linac now supplementing the existing cyclotrons, GANIL provides an intensity and variety of beams that is unmatched in a single laboratory, making it a unique multidisciplinary facility in the world today.

The post Spiralling into the femtoscale appeared first on CERN Courier.

]]>
Feature The SPIRAL2 facility at GANIL will probe short-lived heavy nuclei and address applications in fission and materials science. https://cerncourier.com/wp-content/uploads/2020/11/CCNovDec20_SPIRAL_frontis.jpg
Exploring nuclei at the limits https://cerncourier.com/a/exploring-nuclei-at-the-limits/ Fri, 18 Sep 2020 09:40:00 +0000 https://preview-courier.web.cern.ch/?p=88533 Studies using traps and lasers not only help researchers understand nuclear structure, but also offer new ways to look for physics beyond the Standard Model.

The post Exploring nuclei at the limits appeared first on CERN Courier.

]]>
Chart of nuclides

Understanding how the strong interaction binds the ingredients of atomic nuclei is the central quest of nuclear physics. Since the 1960s, CERN’s ISOLDE facility has been at the forefront of this quest, producing the most extreme nuclear systems and examining their basic properties.

A chemical element is defined by the number of protons in its nucleus, with the number of neutrons defining its isotopes. Apart from a few interesting exceptions, all elements in nature have at least one stable isotope. These form the so-called valley of stability in the nuclear chart of atomic number versus neutron number (see “Nuclear landscape” figure). Adding or removing neutrons disturbs the nuclear equilibrium and creates isotopes that are generally radio­active; the greater the proton–neutron imbalance, the faster the radioactive decay.

Most of the developments have been exported to other radioactive beam facilities around the world

The mass of a nucleus reveals its binding energy, which reflects the interplay of all forces at work within the nucleus from the strong, weak and electromagnetic interactions. Indications of changes in nuclear shape as neutrons are added are often first revealed indirectly, as a sudden change in the mass, and can then be probed in detail by measurements of the charge radius and electromagnetic moments. Such diagnosis – performed by ion-trapping and laser-spectroscopy experiments on short-lived (from a few milliseconds upwards) isotopes – provides the first vital signs concerning the nature of nuclides with extreme proton-to-neutron ratios.

Recent mass-spectrometry measurements and high-precision measurements of nuclear moments and radii at ISOLDE demonstrate the rapid progress being made in understanding the stubborn mysteries of the nucleus. ISOLDE’s state-of-the-art laser-spectroscopy tools are also opening an era where molecular radioisotopes can be used as sensitive probes for physics beyond the Standard Model.

Tools of the trade

Progress in understanding the nucleus has gone hand in hand with the advancement of new techniques. Mass measurements of stable nuclei pioneered by Francis Aston nearly a century ago revealed a near-constant binding energy per nucleon. This pointed to a characteristic saturation of the nuclear force, which underlies the liquid-drop model and led to the semi-empirical mass formula for the nucleus developed by Bethe and von Weizsäcker. With the advent of particle accelerators in the 1930s, more isotopic mass data became available from reactions and decays, bringing new surprises. In particular, comparisons with the liquid drop revealed conspicuous peaks at certain so-called “magic” numbers (2, 8, 20, 28, 50, 82, 126), analogous to the high atomic-ionisation potentials of the closed electron-shell noble-gas elements. These findings inspired the nuclear shell model, developed by Maria Goeppert-Mayer and Hans Jensen, which is still used as an important benchmark today. The difference with the atomic system is that the force that governs the nuclear shells is poorly understood. This is because nucleons are themselves composite particles that interact through the complex interplay of three fundamental forces, rather than the single electromagnetic force governing atomic structure. The most important question in nuclear physics today is to describe these closed shells from fundamental principles (e.g. the strong interaction between quarks and gluons inside nucleons), to understand why shell structure erodes and how new shells arise far from stability.
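For reference, the Bethe–von Weizsäcker formula mentioned above expresses the binding energy of a nucleus with mass number A and proton number Z as a sum of liquid-drop terms. Written out (with one common set of fitted coefficients; exact values vary between fits):

```latex
% Semi-empirical (Bethe--von Weizsaecker) mass formula: binding energy B(A,Z).
\begin{equation}
  B(A,Z) = a_V A \;-\; a_S A^{2/3} \;-\; a_C \frac{Z(Z-1)}{A^{1/3}}
           \;-\; a_A \frac{(A-2Z)^2}{A} \;+\; \delta(A,Z)
\end{equation}
% Typical fitted coefficients (illustrative textbook values):
% a_V ~ 15.8 MeV (volume), a_S ~ 18.3 MeV (surface), a_C ~ 0.71 MeV (Coulomb),
% a_A ~ 23.2 MeV (asymmetry); the pairing term delta(A,Z) is roughly
% +11.2/sqrt(A) MeV for even-even nuclei, 0 for odd A, and -11.2/sqrt(A) MeV
% for odd-odd nuclei. Nuclei at the magic numbers are more bound than this
% smooth formula predicts -- the deviations that motivated the shell model.
```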

Laser and trap experiments in the low-energy section of ISOLDE

A key to reaching a deeper understanding of nuclear structure is the ability to measure the size and shape of nuclei. This was made possible using the precision technique of laser spectroscopy, which was pioneered with tremendous success at ISOLDE in the late 1970s. While increased binding energy is a tell-tale sign of a deforming nucleus, it gives no specific information concerning nuclear size or shape. Closed-shell configurations tend to favour spherical nuclei, but since these are rather rare, a particularly important feature of nuclei is their deformation. Inspecting the electromagnetic moments derived from the measured atomic hyperfine structure, and the changes in charge radii derived from isotopic shifts, provides detailed information about nuclear shapes and deformation, beautifully complementing mass measurements.

During the past half-century, nuclear science at ISOLDE has expanded beyond fundamental studies to applications involving radioactive tracers in materials (including biomaterials) and the fabrication of isotopes for medicine (with the MEDICIS facility). But the bulk of the ISOLDE physics programme, around 70%, is still devoted to the elucidation of nuclear structure and the properties of fundamental interactions. These studies are carried out through nuclear reactions, by decay spectroscopy, or by measuring the basic global properties – mass and size – of the most exotic species possible.

Half a century of history

The fabrication of extreme nuclear systems requires a driver accelerator of considerable energy, and CERN’s expertise here has been instrumental. After many years receiving proton beams from a 600 MeV synchrocyclotron (the SC, now a museum piece at CERN), ISOLDE now lies just off the beam line to the Proton Synchrotron (PS), receiving 1.4 GeV beam pulses from the PS Booster (see “ISOLDE from above” figure). ISOLDE in fact receives typically 50% of the pulses in the so-called super-cycle that links the intricate complex of CERN’s injectors for the LHC.

The heart of ISOLDE is a cylindrical target that can contain various different materials. The stable nuclei in the target are dissociated by the proton impact and form exotic combinations of protons and neutrons. Heating the target (up to 2000 °C) helps these fleeting nuclides to escape into an ionisation chamber, in which they form 1+ ions that are electrostatically accelerated to around 50 keV. Isotopes of one particular mass are selected using one of two available mass separators, and subsequently delivered to the experiments through more than a dozen beamlines. A similar number of permanent experimental setups are operated by several small international collaborations. Each year, more than 40 experiments are performed at ISOLDE by more than 500 users. More than 900 users from 26 European and 17 non-European countries are registered as members of the ISOLDE collaboration.

A new era for fundamental physics research has opened up

ISOLDE sets the global standard for the production of exotic nuclear species at low energies, producing beams that are particularly amenable to study using precision lasers and traps developed for atomic physics. Hence, ISOLDE is complementary to higher energy, heavy-ion facilities such as the Radioactive Isotope Beam Factory (RIBF) at RIKEN in Japan, the future Facility for Rare Isotope Beams (FRIB) in the US, and the Facility for Antiproton and Ion Research (FAIR/GSI) in Europe. These installations produce even more exotic nuclides by fragmenting heavy GeV projectiles on a thin target, and are more suitable for studying high-energy reactions such as breakup and knock-out. Since 2001, ISOLDE has also driven low-energy nuclear-reaction studies by installing a post-accelerator that enables exotic nuclides to be delivered at MeV energies for the study of more subtle nuclear reactions, such as Coulomb excitation and transfer. Post-accelerated radioactive beams have superior optical quality compared to the GeV beams from fragment separators, so the radioactive beams accelerated in the REX and more recent HIE-ISOLDE superconducting linacs enable tailored reactions that reveal novel aspects of nuclear structure.

Tuning ISOLDE’s high-precision mass spectrometer

ISOLDE’s state-of-the-art experimental facilities have evolved from more than 50 years of innovation by a dedicated and close-knit community, which is continuously expanding and also includes materials scientists and biochemists. The pioneering experiments concerning binding energies, charge radii and moments were all performed at CERN during the 1970s. This work, spearheaded by the Orsay group of the late Robert Klapisch, saw the first use of on-line mass separation for the identification of many new exotic species, such as ³¹Na. This particular success led to the first precision mass measurements in 1975, which hinted at the surprising disappearance of the N = 20 shell closure, eight neutrons heavier than the stable nucleus ²³Na. In collaboration with atomic physicists at Orsay, Klapisch’s team also performed the first laser spectroscopy of ³¹Na in 1978, revealing the unexpectedly large size of this exotic isotope. To reach heavier nuclides, a mass spectrometer with higher resolution was required, so the work naturally continued at the expanding ISOLDE facility in the early 1980s.

Meanwhile, another pioneering experiment was initiated by the group of the late Ernst-Wilhelm Otten. After having developed the use of optical pumping with spectral lamps in Mainz to measure charge radii, Otten’s group exploited ISOLDE’s first offerings of neutron-deficient Hg isotopes and discovered the unique feature of shape-staggering in 1972. Through continued technical improvements, the Mainz group established the collinear laser spectroscopy (COLLAPS) programme at ISOLDE in 1979, with results on barium and ytterbium isotopes. When tunable lasers and ion traps became available in the early 1980s, the era of high-precision measurements of radii and masses began. These atomic-physics inventions have revolutionised the study of isotopes far from stability and the initial experimental set-ups are still in use today thanks to continuous upgrades and the introduction of new measurement methods. Most of these developments have been exported to other radioactive beam facilities around the world.

Mass measurements with ISOLTRAP

ISOLTRAP is one of the longest established experiments at ISOLDE. Installed in 1985 by the group of Hans-Jürgen Kluge from Mainz, it was the first Penning trap on-line at a radioactive beam facility, spawning a new era of mass spectrometry. The mass is determined from the cyclotron frequency of the trapped ion, and bringing the technique on line required significant and continuous development, notably with buffer-gas cooling techniques for ion manipulation. Today, ISOLTRAP is composed of four ion traps, each of which has a specific function for preparing the ion of interest to be weighed.
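The underlying relation is simple: in a magnetic field B, an ion of charge q and mass m circulates at the cyclotron frequency ν_c = qB/(2πm), so measuring ν_c in a known field yields the mass. A minimal sketch (the 6 T field and the ¹³³Cs⁺ example are illustrative values, not ISOLTRAP specifications):

```python
import math

# Mass determination in a Penning trap: nu_c = q * B / (2 * pi * m).
ELEMENTARY_CHARGE = 1.602176634e-19   # C
ATOMIC_MASS_UNIT = 1.66053906660e-27  # kg

def cyclotron_frequency_hz(mass_u: float, b_tesla: float, charge_state: int = 1) -> float:
    """Cyclotron frequency of an ion of the given mass (in u) and charge state."""
    mass_kg = mass_u * ATOMIC_MASS_UNIT
    return charge_state * ELEMENTARY_CHARGE * b_tesla / (2 * math.pi * mass_kg)

# Illustrative example: a singly charged 133Cs ion in a 6 T field.
nu_c = cyclotron_frequency_hz(mass_u=132.905, b_tesla=6.0)
print(f"nu_c = {nu_c / 1e3:.0f} kHz")  # ~693 kHz; inverting the relation gives m
```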

Since the first results on caesium, published in 1987, ISOLTRAP has measured the masses of more than 500 species spanning the entire nuclear chart. The most recent results, published this year by Vladimir Manea (Paris-Saclay), Jonas Karthein (Heidelberg) and colleagues, concern the strength of the N = 82 shell closure below the magic (Z = 50) ¹³²Sn, derived from the masses of (Z = 48) ¹³²,¹³⁰Cd. The team found that the binding energy only two protons below the closed shell was much less than predicted by global microscopic models, stimulating new ab-initio calculations based on a nucleon–nucleon interaction derived from QCD through chiral effective-field theory. These calculations were previously available only for lighter systems but are now, for the first time, feasible in the region just south-east of ¹³²Sn, which is of particular interest for the rapid neutron-capture process creating elements in merging neutron stars.

The other iconic doubly magic nucleus ⁷⁸Ni (Z = 28, N = 50) is not yet available at ISOLDE due to the refractory nature of nickel, which slows its release from the thick target so that it decays on the way out. However, the production of copper – just one proton above – is so good that CERN’s Andree Welker and his colleagues at ISOLTRAP were recently able to probe the N = 50 shell by measuring the mass of the nuclear neighbour ⁷⁹Cu, finding it to be consistent with a doubly magic character for ⁷⁸Ni. Masses from large-scale shell-model calculations were in excellent agreement with the observed copper masses, indicating the preservation of the N = 50 shell strength, but with some deformation energy creeping in to help. Complementary observables from laser spectroscopy helped to tell the full story, with results on moments and radii from the COLLAPS and the more recent Collinear Resonance Ionization Spectroscopy (CRIS) experiments adding an interesting twist.

Laser spectroscopy with COLLAPS and CRIS

Quantum electrodynamics provides its predictions of atomic energy levels mostly by assuming the nucleus is point-like and infinitely heavy. In reality, the nucleus has a finite mass as well as non-zero charge and current distributions, which affect the fine structure. Thus, complementary to the high-energy scattering experiments used to probe nuclear sizes, the energy levels of orbiting electrons offer a marvellous probe of the electric and magnetic properties of the nucleus. This fact is exploited by the elegant technique of laser spectroscopy, a fruitful marriage of atomic and nuclear physics realised by the COLLAPS collaboration since the late 1970s. COLLAPS uses tunable continuous-wave lasers for high-precision studies of exotic nuclear radii and moments, and similar setups are now running at other facilities, such as Jyväskylä in Finland, TRIUMF in Canada and NSCL-MSU in the US.

A recent highlight from COLLAPS, obtained this year by Simon Kaufmann of TU Darmstadt and co-workers, is the measurement of the charge radius of the exotic, semi-magic isotope ⁶⁸Ni. Such medium-mass exotic nuclei are now within reach of modern ab-initio chiral effective-field theories, which reveal a strong correlation between the nuclear charge radius and its dipole polarisability. With both measured for ⁶⁸Ni, the data provide a stringent benchmark for theory, and allow researchers to constrain the point-neutron radius and the neutron skin of ⁶⁸Ni. The latter, in turn, is related to the nuclear equation of state, which plays a key role in supernova explosions and compact-object mergers, such as the recent neutron-star merger GW170817.

CRIS collaborators

Building on pioneering work by COLLAPS, the collinear laser beamline CRIS was constructed at ISOLDE 10 years ago by a collaboration between the groups of Manchester and KU Leuven. In CRIS, a bunched atom beam is overlapped with two or three pulsed laser beams, and the atoms are resonantly ionised via a particular hyperfine transition. The resulting ions are then deflected from the remaining background atoms and counted in quasi-background-free conditions. CRIS has dramatically improved the sensitivity of the collinear laser-spectroscopy method, so that beams containing just a few tens of ions per second can now be studied with the same resolution as the optical technique of COLLAPS.

Ruben de Groote of KU Leuven and co-workers recently used CRIS to study the moments and charge radii of the copper isotopes up to ⁷⁸Cu, providing critical information on the wave function and shape of these exotic neighbours, and insight into the doubly magic nature of ⁷⁸Ni. Both the ISOLTRAP and CRIS results provide a consistent picture of fragile equilibrium in ⁷⁸Ni, where the failing strength of the proton and neutron shell closures is shored up with binding energy brought by slight deformation.

These precision measurements in new regions of the nuclear chart bring complementary observables that must be coherently described by global theoretical approaches. They have stimulated and guided the development of new ab-initio results, which now allow the properties of extreme nuclear matter to be predicted. While ISOLDE cannot produce all nuclides on the chart (for example, the super-heavy elements), precision tests in other, key regions provide confidence in global-model predictions in regions unreachable by experiment.

Searches for new physics

By combining the ISOLDE expertise in radioisotope production with the mass-spectrometry feats of ISOLTRAP and the laser-spectroscopy prowess of the CRIS and RILIS (Resonant Ionization Laser Ion Source) teams, a new era for fundamental physics research has opened up. It is centred on the ability of ISOLDE to produce short-lived radioactive molecules composed of heavy pear-shaped nuclei, in which a putative electric dipole moment (EDM) would be amplified to offer a sensitive test of time-reversal and other fundamental symmetries. Molecules of radium fluoride (RaF) are predicted to be the most sensitive probes for such precision studies: the heavy mass and octupole deformation (pear shape) of some radium isotopes, immersed in the large electric field induced by the molecular RaF environment, make these molecules very sensitive probes for symmetry-violation effects, such as the existence of an EDM. However, these precision studies require laser cooling of the RaF molecules, and since all isotopes of Ra are radioactive, the molecular spectroscopy of RaF was previously known only theoretically.

ISOLDE’s Collinear Laser Spectroscopy experiment

This year, for the very first time, an ISOLDE collaboration led by CRIS collaborator Ronald Garcia Ruiz at CERN was able to produce, identify and perform spectroscopy on RaF molecules containing different long-lived radioisotopes of radium. Specific Ra isotopes were chosen because of their octupole nature, as revealed by experiments at the REX- and HIE-ISOLDE accelerators in 2013 and 2020. The measured molecular excitation spectra provide clear evidence for an efficient laser-cooling scheme, marking the first step towards precision studies.

Many interesting new-physics opportunities will open up using different kinds of radioactive molecules, each tuned for sensitivity to specific symmetry-violating effects to test the Standard Model, but also with potential impact in nuclear physics (for example, enhanced sensitivity to specific moments), chemistry and astrophysics. This will also require dedicated experimental set-ups combining lasers with traps. The CRIS collaboration is preparing these new set-ups, and the ability to produce RaF and other radioactive molecules is also under investigation at other facilities, including TRIUMF and the low-energy branch at FRIB. More than 50 years after its breakthrough beginning, ISOLDE continues to forge new paths in both applied and fundamental research.

The post Exploring nuclei at the limits appeared first on CERN Courier.

]]>
Feature Studies using traps and lasers not only help researchers understand nuclear structure, but also offer new ways to look for physics beyond the Standard Model. https://cerncourier.com/wp-content/uploads/2020/09/CCSepOct20_ISOLDE_feature.jpg
ESS under construction https://cerncourier.com/a/ess-under-construction/ Fri, 18 Sep 2020 09:26:17 +0000 https://preview-courier.web.cern.ch/?p=88584 The European Spallation Source will provide neutron beams 100 times brighter than those from reactor sources, enabling new research into material properties and fundamental physics.

The post ESS under construction appeared first on CERN Courier.

]]>
Aerial view of the ESS

Just a few years after the discovery of the neutron by James Chadwick in 1932, investigations into the properties of neutrons by Fermi and others revealed the strong energy dependence of the neutron’s interactions with matter. This knowledge enabled the development of sustainable neutron production by fission, opening the era of atomic energy. The first nuclear-fission reactors in the 1940s were also equipped with the capacity for materials irradiation, and some provided low-energy (thermal) neutron beams of sufficient intensity for studies of atomic and molecular structure. Despite the high cost of investment in nuclear-research reactors, neutron science flourished to become a mainstay among large-scale facilities for materials research around the world.

The electrical neutrality of neutrons allows them to probe deep into matter in a non-destructive manner, where they scatter off atomic nuclei to reveal important information about atomic and molecular structure and dynamics. Neutrons also carry a magnetic moment. This property, combined with their absence of electric charge, makes neutrons uniquely sensitive to magnetism at an atomic level. On the downside, the absence of electric charge means that neutron-scattering cross-sections are much weaker than those for X-rays and electrons, making neutron flux a limiting factor in the power of this method for scientific research.

ESS site layout

Throughout the 1950s and 1960s, incremental advances in the power of nuclear-research reactors and improvements in moderator design provided increasing fluxes of thermal neutrons. In Europe these developments culminated in the construction of the 57 MW high-flux reactor (HFR) at the Institut Laue-Langevin (ILL) in Grenoble, France, with a compact core containing 9 kg of highly enriched uranium enabling neutron beams with energies from around 50 μeV to 500 meV. When the HFR came into operation in 1972, however, it was clear that nuclear-fission reactors were already approaching their limit in terms of steady-state neutron flux (roughly 1.5 × 10¹⁵ neutrons per cm² per second).

Spallation has long been hailed as the method with the potential to push through to far greater neutron fluxes

In an effort to maintain pace with advances in other methods for materials research, such as synchrotron X-ray facilities and electron microscopy, accelerator-based neutron sources were established in the 1980s in the US (IPNS and LANSCE), Japan (KENS) and the UK (ISIS). Spallation has long been hailed as the method with the potential to push through to far greater neutron fluxes, and hence to provide a basis for continued growth of neutron science. However, after nearly 50 years of operation, and with 10 more modern medium- to high-flux neutron sources (including five spallation sources) in operation around the world, the HFR is still the benchmark source for neutron-beam research. Of the spallation sources, the most powerful (SNS at Oak Ridge National Laboratory in the US and J-PARC in Japan) have now been in operation for more than a decade. SNS has reached its design power of 1.4 MW, and J-PARC is planning for tests at 1 MW. At these power levels the sources are competitive with ILL for leading-edge research. It has long been known that the establishment of a new high-flux spallation neutron facility is needed if European science is to avoid a severe shortage in access to neutron science in the coming years (CERN Courier May/June 2020 p49).

Unprecedented performance

The European Spallation Source (ESS), with a budget of €1.8 billion (2013 figures), is a next-generation high-flux neutron source that is currently entering its final construction phase. Fed by a 5 MW proton linac, and fitted with a highly compact neutron moderator and matched neutron-transport systems, the ESS is predicted to deliver neutron beams at full power with a brightness exceeding that of the HFR by more than two orders of magnitude.

Target station monolith

The idea for the ESS was advanced in the early 1990s. The decision in 2009 to locate it in Lund, Sweden, led to the establishment of an organisation to build and operate the facility (ESS AB) in 2010. Ground-breaking took place in 2014, and today construction is in full swing, with first science expected in 2023 and full user operation in 2026. The ESS is organised as a European Research Infrastructure Consortium (ERIC) and at present has 13 member states: Czech Republic, Denmark, Estonia, France, Germany, Hungary, Italy, Norway, Poland, Spain, Sweden, Switzerland and the UK. Sweden and Denmark are the host countries, providing nearly half of the budget for the construction phase. Around 70% of the funding from the non-host countries is in the form of in-kind contributions, meaning that the countries are delivering components, personnel or other support services to the facility rather than cash.

The unprecedented brightness of ESS neutrons will enable smaller samples, faster measurements and more complex experiments than is possible at existing neutron sources. This will inevitably lead to discoveries across a wide range of scientific disciplines, from condensed-matter physics, solid-state chemistry and materials sciences, to life sciences, medicine and cultural heritage. A wide range of industrial applications in polymer science and engineering are also anticipated, while new avenues in fundamental physics will be opened (see “Fundamental physics at the ESS” panel).

Fundamental physics at the ESS

The ESS will offer a multitude of opportunities for fundamental physics with neutrons, neutrinos and potentially other secondary particles from additional target stations. While neutron brightness and pulse time structure are key parameters for neutron scattering (the main focus of ESS experiments), the total intensity is more important for many fundamental-physics experiments.

A cold neutron-beam facility for particle physics called ANNI is proposed to allow precision measurements of the beta decay, hadronic weak interactions and electromagnetic properties of the neutron. ANNI will improve the accuracy of measurements of neutron beta decay by an order of magnitude. Experiments will probe a broad range of new-physics models at mass scales from 1 to 100 TeV, far beyond the threshold of direct particle production at accelerators, and resolve the tiny effects of hadronic weak interactions, enabling quantitative tests of the non-perturbative limit of quantum chromodynamics.

Another collaboration is proposing a two-stage experiment at the ESS to search for baryon-number violation. The first stage, HIBEAM, will look for evidence of sterile neutrinos. As a second stage, NNBAR could be installed at the large beam port to search for oscillations between neutrons and anti-neutrons. Observing such a transition would show that baryon number is violated by two units and that matter containing neutrons is unstable, potentially shedding light on the observed baryon asymmetry of the universe.

A design study, financed through the European Commission’s Horizon 2020 programme, is also under way for the ESS Neutrino Super Beam (ESSνSB) project. This ambitious project would see an accumulator ring and a separate neutrino target added to the ESS facility, with the aim of sending neutrinos to a large underground detector in mid-Sweden, 400–500 km from the ESS. Here, the neutrinos would be detected at their second oscillation maximum, giving the highest sensitivity for discovery and/or measurement of the leptonic CP-violating phase. The accumulator ring and the resulting short proton pulses needed by ESSνSB would also open up other kinds of fundamental physics, new perspectives in neutron scattering, and muon storage rings.

Finally, a proposal has been submitted to the ESS concerning coherent neutrino–nucleus scattering (CEνNS). The high proton-beam power, together with the 2 GeV proton energy, will provide a neutrino flux from the spallation target 10 times higher than previously obtained for CEνNS. Measured for the first time by the COHERENT collaboration in 2017 at ORNL’s Spallation Neutron Source, CEνNS offers a new way to probe the properties of the neutrino, including searches for sterile neutrinos and a neutrino magnetic moment, and could help reduce the mass of neutrino detectors.

From the start, the ESS has been driven by the neutron-scattering community, with strong involvement from all the leading neutron-science facilities around Europe. To maximise its scientific potential, a reference set of 22 instrument concepts was developed from which 15 instruments covering a wide range of applications were selected for construction. The suite includes three diffractometers for hard-matter structure determination, a diffractometer for macromolecular crystallography, two small-angle scattering instruments for the study of large-scale structures, two reflectometers for the study of surfaces and interfaces, five spectrometers for the study of atomic and molecular dynamics over an energy range from a few μeV to several hundred meV, a diffractometer for engineering studies and a neutron imaging station (see “ESS layout” figure). Given that the ESS target system has the capacity for two neutron moderators and that the beam extraction system allows viewing of each moderator by up to 42 beam ports, there is the potential for many more neutron instruments without major investment in the basic infrastructure. The ESS source also has a unique time structure, with far longer pulses than existing pulsed sources, and an innovative bi-spectral neutron moderator, which allows a high degree of flexibility in the choice of neutron energy.

Accelerator and target

Most of the existing spallation neutron sources use a linear accelerator to accelerate protons to high energies. The particles are stored in an accumulator ring and are then extracted in a short pulse (typically a few microseconds in length) onto a heavy-metal spallation target such as tungsten or mercury, both of which have a high neutron yield. A notable exception is SINQ at PSI, which uses a cyclotron that produces a continuous beam.

A section of the cryogenic system

ESS has a linear accelerator but no accumulator ring, and it will thus have far longer proton pulses of 2.86 ms. This characteristic, combined with the 14 Hz repetition rate of the ESS accelerator, is a key advantage of the ESS for studies of condensed matter, because it allows good energy resolution and broad dynamic range. The result is a source with unprecedented flexibility to be optimised for studies from condensed-matter physics and solid-state chemistry, to polymers and the biological sciences with applications to medical research, industrial materials and cultural heritage. The ESS concept is also of major benefit for experiments in fundamental physics, where the total integrated flux is a main figure of merit.

The high neutron flux at ESS is possible because it will be driven by the world’s most powerful particle accelerator, in terms of MW of beam on target. It will have a proton beam of 62.5 mA accelerated to 2 GeV, with most of the energy gain coming from superconducting radio-frequency cavities cooled to 2 K. Together with its long pulse structure, this gives 5 MW average power and 125 MW of peak power. For proton energies around a few GeV, the neutron production is nearly proportional to the beam power, so the ratio between beam current and beam energy is to a large extent the result of a cost optimisation, while the pulse structure is set by requirements from neutron science.
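These headline figures are internally consistent and easy to verify: the peak power is simply current times energy, and the average power is the peak power scaled by the duty cycle set by the 2.86 ms pulses at 14 Hz. A quick check:

```python
# Consistency check of the quoted ESS beam parameters.
current_a = 62.5e-3       # peak proton current (A)
energy_ev = 2e9           # proton kinetic energy (eV)
pulse_length_s = 2.86e-3  # proton pulse length (s)
rep_rate_hz = 14          # pulse repetition rate (Hz)

peak_power_w = current_a * energy_ev         # 125 MW during the pulse
duty_cycle = pulse_length_s * rep_rate_hz    # ~4% of the time the beam is on
average_power_w = peak_power_w * duty_cycle  # ~5 MW on target

print(f"peak {peak_power_w / 1e6:.0f} MW, duty {duty_cycle:.1%}, "
      f"average {average_power_w / 1e6:.1f} MW")
```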

Linac installation

The neutrons are produced by spallation when the high-energy protons hit the rotating tungsten target. The 2.5 m-diameter target wheel consists of 36 sectors of tungsten blocks inside a stainless-steel disk. It is cooled by helium gas and rotates at approximately 0.4 Hz, such that successive beam pulses hit adjacent sectors, allowing adequate heat dissipation and limiting radiation damage. The neutrons enter moderator–reflector systems above or below the target wheel. The unique ESS “butterfly” moderator design consists of interpenetrating vessels of water and parahydrogen, and allows viewing of either or both vessels from a 120°-wide array of beam ports on either side. The moderator is only 3 cm high, ensuring the highest possible brightness. Each instrument is thus fed by an intense mix of thermal (room-temperature) and cold (20 K) neutrons that is optimised to its scientific requirements. The neutrons are transported to the instruments through neutron-reflecting guides that are up to 165 m long. Because neutron-scattering cross-sections are weak, neutron optics is challenging, and the technology for transporting neutrons is correspondingly sophisticated. The guides consist of optically flat glass or metal channels coated with many thin alternating layers of nickel and titanium, in a sequence designed to enhance the critical angle for reflection. The optical properties of the guides allow for broad-spectrum focusing to maximise intensity for varying sample sizes, typically in the range from a few mm³ to several cm³.
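The quoted rotation rate follows from synchronising the wheel with the pulse train: for each 14 Hz pulse to land on the next of the 36 sectors, the wheel must advance one sector per pulse, i.e. rotate at 14/36 ≈ 0.39 Hz. A sketch of this bookkeeping:

```python
# Synchronising the ESS target wheel with the proton pulse train.
rep_rate_hz = 14  # proton pulses per second
n_sectors = 36    # tungsten sectors on the wheel

# Advancing one sector per pulse fixes the rotation frequency:
rotation_hz = rep_rate_hz / n_sectors
print(f"rotation rate ~{rotation_hz:.2f} Hz")  # ~0.39 Hz, the quoted ~0.4 Hz

# Each sector is then struck once per revolution, i.e. every 36/14 seconds:
seconds_between_hits = n_sectors / rep_rate_hz
print(f"each sector is hit every {seconds_between_hits:.1f} s")  # ~2.6 s
```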

Under construction

Construction of the ESS has been growing in intensity since it began in 2014. The infrastructure work was organised differently from that of other large-scale scientific research facilities: a partnering collaboration agreement was set up with the main contractor (Skanska), with the design and target cost settled in separate agreements at the beginning of each stage of construction, making it a shared interest to build the facility within budget and on schedule.

Every year, up to 3000 researchers from all over the world are expected to carry out around 1000 experiments

Today, all the accelerator buildings have been handed over from the contractor to the ESS. The ion source, where the protons are produced from hydrogen gas, was delivered by INFN in Catania at the end of 2017. After installation, testing and commissioning to nominal beam parameters, the ion source was inaugurated by the Swedish king and the Italian president in November 2018. Since then, the radio-frequency quadrupole and other accelerator components have been put into position in the accelerator tunnel, and the first prototype cryomodule has been cooled to 2 K. There is intense installation activity in the accelerator, where 5 km of radio-frequency waveguides are being mounted, 6000 welds of cooling-water pipes being performed and 25,000 cables being pulled. The target building is under construction, and has reached its full height of 31 m. The large target vacuum vessel is due to arrive from in-kind partner ESS Bilbao in Spain later this year, and the target wheel in early 2021.

The handover of buildings for the neutron instruments started in September 2019, with the hall for the long instruments along with the buildings housing associated laboratories and workshops. While basic infrastructure such as the neutron bunker and radiation shielding for the neutron guides are provided by ESS in Lund, European partner laboratories are heavily involved in the design and construction of the neutron instruments and the sample-environment equipment. ESS has developed its own detector and chopper technologies for the neutron instruments, and these are being deployed for a number of the instruments currently under construction. In parallel, the ESS Data Management and Software Centre, located in Copenhagen, Denmark, is managing the development of instrument-control, data-management, visualisation and analysis systems. During full operation, the ESS will produce scientific data at a rate of around 10 PB per year, while the complexity of the data-handling requirements of the different instruments and the need for real-time visualisation and processing pose additional challenges.
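To put 10 PB per year in perspective, spreading it evenly over a year corresponds to a sustained rate of a few hundred megabytes per second (real operation would of course be burstier; the smooth average is an illustrative assumption):

```python
# Average sustained data rate implied by 10 PB per year.
PB = 1e15  # bytes in a petabyte (decimal convention)
SECONDS_PER_YEAR = 365.25 * 24 * 3600

yearly_volume_bytes = 10 * PB
average_rate = yearly_volume_bytes / SECONDS_PER_YEAR
print(f"~{average_rate / 1e6:.0f} MB/s sustained")  # ~317 MB/s
```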

A linac warm unit

The major upcoming milestones for the ESS project are beam-on-target, when the first neutrons are produced, and first science, when the first neutron-scattering experiments take place. According to current schedules, these milestones will be reached in October 2022 and July 2023, respectively. Although the beam power at the first-science milestone is expected to be around 100 kW, performance simulations indicate that the quality of results from the first experiments will still have a high impact within the user community. The initiation of an open user programme, with three or more of the neutron instruments beginning operation, is expected in 2024, with further instruments becoming available in 2025. When the construction phase ends in late 2025, the ESS is expected to be operating at 2 MW, and all 15 neutron instruments will be in operation or ready for hot commissioning.

The ESS has been funded to provide a service to the scientific community for leading-edge research into materials properties. Every year, up to 3000 researchers from all over the world are expected to carry out around 1000 experiments there. Innovation in the design of the accelerator, the target system and its moderators, and in the key neutron technologies of the instruments (neutron guides, detectors and choppers), ensures that the ESS will establish itself at the vanguard of scientific discovery and development well into the 21st century. Furthermore, provision has been made for the expansion of the ESS to provide a platform for leading-edge research into fundamental physics and into as-yet-unidentified fields of research.

The post ESS under construction appeared first on CERN Courier.

]]>
Feature The European Spallation Source will provide neutron beams 100 times brighter than those from reactor sources, enabling new research into material properties and fundamental physics. https://cerncourier.com/wp-content/uploads/2020/09/CCSepOct20_ESS_monolith.jpg
Neutron sources join the fight against COVID-19 https://cerncourier.com/a/neutron-sources-join-the-fight-against-covid-19/ Tue, 07 Jul 2020 11:29:44 +0000 https://preview-courier.web.cern.ch/?p=87713 Advanced neutron facilities such as the Institut Laue-Langevin are gearing up to enable a deeper understanding of the structural workings of SARS-CoV-2.

The post Neutron sources join the fight against COVID-19 appeared first on CERN Courier.

]]>
The LADI instrument at the ILL

The global scientific community – well beyond pharmaceutical and medical researchers – has mobilised at an unprecedented rate in response to the COVID-19 pandemic. The world’s most powerful analytical tools, including neutron sources, harbour the unique ability to reveal the invisible, structural workings of the virus, which will be essential to developing effective treatments. Since the outbreak of the pandemic, researchers worldwide have been using large-scale research infrastructures such as synchrotron X-ray radiation sources (CERN Courier May/June 2020 p29), as well as cryogenic electron microscopy (cryo-EM) and nuclear magnetic resonance (NMR) facilities, to determine the 3D structures of proteins of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), which can lead to COVID-19 respiratory disease, and to identify potential drugs that can bind to these proteins in order to disable the viral machinery. In a remarkably short time, this effort has already delivered a large number of structures – with more added each week – and increased our understanding of what potential drug candidates might look like.

COVID-19 affected the operation of all advanced neutron sources worldwide. With one exception (ANSTO in Australia, which continued the production of radioisotopes), all of them were shut down in the context of national lockdowns aimed at reducing the spread of the disease. The neutron community, however, lost no time in preparing for the resumption of activities. Some facilities, like Oak Ridge National Laboratory (ORNL) in the US, have now restarted operation of their sources exclusively for COVID-19 studies. Here in Europe, while waiting (impatiently) for the restart of neutron facilities such as the Institut Laue-Langevin (ILL) in Grenoble, which is scheduled to be operational by mid-August, scientists have been actively pursuing SARS-CoV-2-related projects. Special research teams on the ILL site have been preparing for experiments using a range of neutron-scattering techniques, including diffraction, small-angle neutron scattering, reflectometry and spectroscopy. Neutrons bring to the table what other probes cannot, and are set to make an important contribution to the fight against SARS-CoV-2.

Unique characteristics

Discovered almost 90 years ago, the neutron has been put to a multitude of uses to help researchers understand the structure and behaviour of condensed matter. These applications include a steadily growing number of investigations into biological systems. For the reasons explained below, these investigations are complementary to the use of X-rays, NMR and cryo-EM. The necessary infrastructure for neutron-scattering experiments is provided to the academic and industrial user communities by a global network of advanced neutron sources. Leading European neutron facilities include the ILL in Grenoble, France, MLZ in Garching, Germany, ISIS in Didcot, UK, and PSI in Villigen, Switzerland. The new European flagship neutron source – the European Spallation Source (ESS) – is under construction in Lund, Sweden.

Structural power

Determining the biological structures that make up a virus such as SARS-CoV-2 (pictured) allows scientists to see what they look like in three dimensions and to understand better how they function, speeding up the design of more effective anti-viral drugs. Knowledge of the structures highlights which parts are the most important: for example, once researchers know what the active site in an enzyme looks like, they can try to design drugs that fit well into it – the classic “lock-and-key” analogy. This is also useful in the development of vaccines: since vaccines are often made from weakened or killed forms of the microbe, its toxins, or one of its surface proteins, knowledge of the structural components that make up a virus is important here too.

Neutrons are a particularly powerful tool for the study of biological macromolecules in solutions, crystals and partially ordered systems. Their neutrality means neutrons can penetrate deep into matter without damaging the samples, so that experiments can be performed at room temperature, much closer to physiological temperatures. Furthermore, in contrast to X-rays, which are scattered by electrons, neutrons are scattered by atomic nuclei, and so neutron-scattering lengths show no correlation with the number of electrons, but rather depend on nuclear forces, which can even vary between different isotopes. As such, while hydrogen (H) scatters X-rays very weakly, and protons (H+) do not scatter X-rays at all, with neutrons hydrogen scatters at a similar level to the other common elements (C, N, O, S, P) of biological macromolecules, allowing hydrogen atoms to be located. Moreover, since hydrogen and its isotope deuterium (²H/D) have scattering lengths that differ in both magnitude and sign, this can be exploited in neutron studies to enhance the visibility of specific structural features by substituting one isotope for the other. Examples include small-angle neutron scattering (SANS) studies of macromolecular structures, which provide low-resolution 3D information on molecular shape without the need for crystallisation, and neutron-crystallography studies of proteins, which provide high-resolution structures, including the locations of individual hydrogen atoms that have been exchanged for deuterium to make them particularly visible. Indeed, neutron crystallography can provide unique information on the chemistry occurring within biological macromolecules, such as enzymes, as recent studies on HIV-1 protease, an enzyme essential for the life cycle of the HIV virus, illustrate.
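The H/D contrast at the heart of these methods can be made quantitative with the tabulated coherent scattering lengths (b_H ≈ −3.74 fm, b_D ≈ +6.67 fm, b_O ≈ +5.80 fm). The sketch below computes the scattering-length density of light and heavy water, assuming bulk densities of 1.00 and 1.11 g/cm³ respectively (a simplification); the large gap between the two values is what contrast-variation experiments exploit:

```python
# Scattering-length density (SLD) contrast between H2O and D2O.
AVOGADRO = 6.02214076e23

# Coherent neutron scattering lengths in fm (standard tabulated values).
B_FM = {"H": -3.739, "D": 6.671, "O": 5.803}

def sld_water(b_hydrogen_fm: float, molar_mass: float, density_g_cm3: float) -> float:
    """SLD of (H|D)2O, returned in units of 1e-6 per square angstrom."""
    b_total_fm = 2 * b_hydrogen_fm + B_FM["O"]
    volume_a3 = molar_mass / (density_g_cm3 * AVOGADRO) * 1e24  # molecular volume, A^3
    return (b_total_fm * 1e-5) / volume_a3 / 1e-6               # fm -> A, then rescale

print(f"H2O: {sld_water(B_FM['H'], 18.02, 1.00):+.2f} x 1e-6 / A^2")  # ~ -0.56
print(f"D2O: {sld_water(B_FM['D'], 20.03, 1.11):+.2f} x 1e-6 / A^2")  # ~ +6.39
```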

Treating and stopping COVID-19

Proteases are like biological scissors that cleave polypeptide chains – the primary structure of proteins – at precise locations. If the cleavage is inhibited, for example by appropriate anti-viral drugs, then the so-called poly-proteins remain in their original state and the machinery of virus replication is blocked. For the treatment to be efficient, this inhibition has to be robust – that is, the drug occupying the active site should be strongly bound, ideally to atoms in the main chain of the protease. This increases the likelihood that treatments remain effective in the long run, despite mutations of the enzyme, since mutations occur only within its side chains. Neutron research therefore provides essential input into the long-term development of pharmaceuticals. This role will be further enhanced in the context of advanced computer-aided drug development, which will rely on an orchestrated combination of high-power computing, artificial intelligence and broad-band experimental data on structures.

A neutron Laue diffraction pattern

Neutron-crystallography data add supplementary structural information to X-ray data by providing key details regarding hydrogen atoms and protons, which are critical players in the binding of such drugs to their target enzyme through hydrogen bonding. They also reveal important details of protein chemistry that help researchers decipher the exact catalytic pathway of an enzyme. In this way, neutron-crystallography data can be hugely beneficial for understanding how these enzymes function and for designing more effective medications to target them. For example, in the study of complexes between HIV-1 protease – the enzyme responsible for the maturation of virus particles into infectious HIV virions – and drug molecules, neutrons can reveal hydrogen-bonding interactions that offer ways to enhance drug binding and reduce the drug resistance of anti-retroviral therapies.

More than half of the SARS-CoV-2-related structures determined thus far are high-resolution X-ray structures of the virus’s main protease, the majority of them bound to potential inhibitors. One of the main challenges in performing neutron crystallography is that larger crystals are required than for comparable X-ray studies, owing to the lower flux of neutron beams relative to X-ray beam intensities. Nevertheless, given the benefits provided by the visualisation of hydrogen-bonding networks for understanding drug binding, scientists have been optimising crystallisation conditions for the growth of larger crystals, in combination with the production of fully deuterated protein, in preparation for neutron-crystallography experiments in the near future. Currently, teams at ORNL, the ILL and the DEMAX facility in Sweden are growing crystals for SARS-CoV-2 investigations.

Proteases are, however, not the only proteins for which neutron crystallography can provide essential information. For example, the spike protein (S-protein) of SARS-CoV-2, which mediates attachment to and entry into human cells, is of great relevance for developing therapeutic defence strategies against the virus. Here, neutron crystallography can potentially provide unique information about the specific domain of the S-protein where the virus binds to human cell receptors. Comparison of the X-ray structures of this region in different coronaviruses (SARS-CoV-2 and SARS-CoV) suggests that small alterations to the amino-acid sequence may enhance the binding affinity of the S-protein to the human receptor hACE2, making SARS-CoV-2 more infectious. Neutron studies will provide further insight into this binding, which is crucial for the attachment of the virus. These experiments are scheduled to take place at the ILL and ORNL (and possibly MLZ), for example, as soon as large enough crystals have been grown.

The big picture

Biological systems have a hierarchy of structures: molecules assemble into structures such as proteins; these form complexes, which, in supramolecular arrangements like membranes, are the building blocks of cells – and, of course, of our bodies. Every part of this huge machinery is subject to continuous reorganisation. To understand the functioning – or, in the case of a disease, the malfunctioning – of a biological system, we must therefore gain insight into the biological mechanisms on all of these different length scales.

The ILL reactor

When it comes to studying the function of larger biological complexes such as assembled viruses, SANS becomes an important analytical tool. The technique’s capacity to distinguish specific regions (RNA, proteins and lipids) of the virus – thanks to advanced deuteration methods – enables researchers to map out the arrangement of the various components, contributing invaluable information to structural studies of SARS-CoV-2. While other analytical techniques provide the detailed atomic-resolution structure of small biological assemblies, neutron scattering allows researchers to pan back to see the larger picture of full molecular complexes, at lower resolution. Neutron scattering is also uniquely suited to determining the structure of functional membrane proteins in physiological conditions. Neutron scattering will therefore make it possible to map out the structure of the complex formed by the S-protein and the hACE2 receptor.

Neutrons can penetrate deep into matter without damaging the samples

Last but not least, a full understanding of the virus’s life cycle requires the study of the interaction of the virus with the cell membrane, and the mechanism it uses to penetrate the host cell. Like HIV, SARS-CoV-2 possesses a viral envelope composed of lipids, proteins and sugars. By providing information on its molecular structure and composition, the technique of neutron reflectometry – whereby highly collimated neutrons are incident on a flat surface and the intensity of the reflected radiation is measured as a function of angle or neutron wavelength – helps to elucidate the precise mechanism the virus uses to penetrate the cell. As in the case of SANS, the strength of neutron reflectometry relies on the fact that it provides a different contrast to X-rays, and that this contrast can be varied via deuteration, allowing researchers, for example, to distinguish a protein inserted into a membrane from the membrane itself. For SARS-CoV-2, this means that neutron reflectometry can provide detailed structural information on the interaction of small protein fragments, so-called peptides, that mimic the S-protein and are believed to be responsible for binding to the receptor of the host cell. Defining this mechanism, which is decisive for the infection, will be essential to controlling the virus and its potential future mutations in the long term.

Tool of choice

And we should not forget that viruses in their physiological environments are highly dynamic systems. Knowing how they move, deform and cluster is essential for optimising diagnostic and therapeutic treatments. Neutron spectroscopy, which is ideally suited to follow the motion of matter from small chemical groups to large macromolecular assemblies, is the tool of choice to provide this information.

The League of Advanced European Neutron Sources (CERN Courier May/June 2020 p49) has rapidly mobilised to conduct all relevant experiments. We are also in close contact with our international partners, some of whom have reopened, or are just in the process of reopening, their facilities. Scientists have to make sure that each research subject is matched with the best-suited analytical tool – in other words, those that have the samples will be given the necessary beam time. Neutron facilities are adapting fast: special access channels to beam time have been implemented to allow the scientific community to respond without delay to the challenge posed by COVID-19.

The post Neutron sources join the fight against COVID-19 appeared first on CERN Courier.

]]>
Feature Advanced neutron facilities such as the Institut Laue-Langevin are gearing up to enable a deeper understanding of the structural workings of SARS-CoV-2. https://cerncourier.com/wp-content/uploads/2020/07/CCJulAug20_NEUTRON_frontis.jpg
CLOUD clarifies cause of urban smog https://cerncourier.com/a/cloud-clarifies-cause-of-urban-smog/ Tue, 07 Jul 2020 10:56:50 +0000 https://preview-courier.web.cern.ch/?p=87722 The result could shape policies for reducing urban particle pollution.

The post CLOUD clarifies cause of urban smog appeared first on CERN Courier.

]]>
Urban flow patterns

Urban particle pollution ranks fifth in the risk factors for mortality worldwide, and is a growing problem in many built-up areas. In a result that could help shape policies for reducing such pollution, the CLOUD collaboration at CERN has uncovered a new mechanism that drives winter smog episodes in cities.

Winter urban smog episodes occur when new particles form in polluted air trapped below a temperature inversion: warm air above the inversion inhibits convection, causing pollution to build up near the ground. However, how additional aerosol particles form and grow in this highly polluted air has puzzled researchers because they should be rapidly lost through scavenging by pre-existing aerosol particles. CLOUD, which uses an ultraclean cloud chamber situated in a beamline at CERN’s Proton Synchrotron to study the formation of aerosol particles and their effect on clouds and climate, has found that ammonia and nitric acid can provide the answer.

Deriving in cities mainly from vehicle emissions, ammonia and nitric acid were previously thought to play a passive role in particle formation, simply exchanging with ammonium nitrate in the particles. However, the new CLOUD study finds that small inhomogeneities in the concentrations of ammonia and nitric acid can drive the growth rates of newly formed particles up to more than 100 times faster than seen before, but only in short spurts that have previously escaped detection. These ultrafast growth rates are sufficient to rapidly transform the newly formed particles to larger sizes, where they are less prone to being lost through scavenging, leading to a dense smog episode with a high number of particles.

“Although the emission of nitrogen oxides is regulated, ammonia emissions are not and may even be increasing with the latest catalytic converters used in gasoline and diesel vehicles,” explains CLOUD spokesperson Jasper Kirkby. “Our study shows that regulating ammonia emissions from vehicles could contribute to reducing urban smog.”

The post CLOUD clarifies cause of urban smog appeared first on CERN Courier.

]]>
News The result could shape policies for reducing urban particle pollution. https://cerncourier.com/wp-content/uploads/2020/07/CCJulAug20_NA_cloud.jpg
Lofty thinking https://cerncourier.com/a/lofty-thinking/ Tue, 07 Jul 2020 09:14:44 +0000 https://preview-courier.web.cern.ch/?p=87746 CERN’s CLOUD experiment has merged the best of particle physics and atmospheric science into a novel experimental approach, says spokesperson Jasper Kirkby.

The post Lofty thinking appeared first on CERN Courier.

]]>
Jasper Kirkby

What, in a nutshell, is CLOUD?

It’s basically a cloud chamber, but not a conventional one as used in particle physics. We realistically simulate selected atmospheric environments in an ultraclean chamber and study the formation of aerosol particles from trace vapours, and how they grow to become the seeds for cloud droplets. We can precisely control all the conditions found throughout the atmosphere such as gas concentrations, temperature, ultraviolet illumination and “cosmic ray” intensity with a beam from CERN’s Proton Synchrotron (PS). The aerosol processes we study in CLOUD are poorly known yet climatically important because they create the seeds for more than 50% of global cloud droplets.

We have 22 institutes and the crème de la crème of European and US atmospheric and aerosol scientists. It’s a fabulous mixture of physicists and chemists, and the skills we’ve learned from particle physics in terms of cooperating and pooling resources have been incredibly important for the success of CLOUD. It’s the CERN model, the CERN culture that we’ve conveyed to another discipline. We implemented the best of CERN’s know-how in ultra-clean materials and built the cleanest atmospheric chamber in the world.

How did CLOUD get off the ground?

The idea came to me in 1997 during a lecture at CERN given by Nigel Calder, a former editor of New Scientist magazine, who pointed out a new result from satellite data about possible links between cosmic rays and cloud formation. That Christmas, while we visited relatives in Paris, I read a lot of related papers and came up with the idea to test the cosmic ray–cloud link at CERN with an experiment I named CLOUD. I did not want to ride into another field telling those guys how to do their stuff, so I wrote a note of my ideas and started to make contact with the atmospheric community in Europe and build support from lab directors in particle physics. I managed to assemble a dream team to propose the experiment to CERN. The hard part was convincing CERN that they should do this crazy experiment. We proposed it in 2000 and it was finally approved in 2006, which I think is a record time for CERN to approve an experiment. There were some people in the climate community who were against the idea that cosmic rays could influence clouds. But we persevered and, once approved, things went very fast. We started taking data in 2009 and have been in discovery mode ever since.

Do you consider yourself a particle physicist or an atmospheric scientist?

An experimental physicist! My training and my love is particle physics, but judging by the papers I write and review, I am now an atmospheric scientist. It was not difficult to make this transition. It was a case of going back to my undergraduate physics and high-school chemistry and learning on the job. It’s also very rewarding. We do experiments, like we all do at CERN, on a 24/7 basis, but with CLOUD I can calculate things in my notebook and see the science that we are doing, so we know immediately what the new stuff is and we can adapt our experiments continuously during our run.

On the other hand, in particle physics the detectors are running all the time but we really don’t know what is in the data without years of very careful analysis afterwards, so there is this decoupling of the result from the actual measurement. Also, in CLOUD we don’t need a separate discipline to tell us about the underlying theory or beauty of what we are doing. In CLOUD you’re the theorist and the experimentalist at the same time – like it was in the early days of particle physics.

How would you compare the Standard Model to state-of-the-art climate models?

It’s night and day. The Standard Model (SM) is such a well formed theory and remarkably high-quality quantitatively that we can see incredibly subtle signals in detectors against a background of something that is extremely well understood. Climate models, on the other hand, are trying to simulate a very complex system about what’s happening on Earth’s surface, involving energy exchanges between the atmosphere, the oceans, the biosphere, the cryosphere … and the influence of human beings. The models involve many parameters that are poorly understood, so modellers have to make plausible yet uncertain choices. As a result, there is much more flexibility in climate models, whereas there is almost none in the SM. Unfortunately, this flexibility means that the predictive power of such models is much weaker than it is in particle physics.

The CLOUD detector

There are skills such as the handling of data, statistics and software optimisation where particle physics is probably the leading science in the world, so I would love to see CERN sponsor a workshop where the two communities could exchange ideas and perhaps even begin to collaborate. This is what CLOUD has done. It’s politically correct to talk about the power of interdisciplinary research, but it’s very difficult in practical terms – especially when it comes to funding because experiments often fall into the cracks between funding agencies.

How has CLOUD’s focus evolved during a decade of running?

CLOUD was designed to explore whether variations of cosmic rays in the atmosphere affect clouds and climate, and that’s still a major goal. What I didn’t realise at the beginning is how important aerosol–particle formation is for climate and health, and just how much is not yet understood. The largest uncertainty facing predictions of global warming is not due to a lack of understanding about greenhouse gases, but about how much aerosols and clouds have increased since pre-industrial times from human activities. Aerosol changes have offset some of the warming from greenhouse gases but we don’t know by how much – it could have offset almost nothing, or as much as one half of the warming effect. Consequently, when we project forwards, we don’t know how much Earth will warm later this century to better than a factor of three.

Many of our experiments are now aimed at reducing the aerosol uncertainties in anthropogenic climate change. Since all CLOUD experiments are performed under different ionisation conditions, we are also able to quantify the effect of cosmic rays on the process under study. A third major focus concerns the formation of smog under polluted urban conditions.

What have CLOUD’s biggest contributions been?

We have made several major discoveries and it’s hard to rank them. Our latest result (CLOUD clarifies cause of urban smog) on the role of ammonia and nitric acid in urban environments is very important for human health. We have found that ammonia and nitric acid can drive the growth rates of newly formed particles up to more than 100 times faster than seen before, but only in short spurts that have previously escaped detection. This can explain the puzzling observation of bursts of new particles that form and grow under highly polluted urban conditions, producing winter smog episodes. An earlier CLOUD result, also in Nature, showed that a few parts-per-trillion of amine vapours lead to extremely rapid formation of sulphuric acid particles, limited only by the kinetic collision rate. We had a huge fight with one of the referees of this paper, who claimed that it couldn’t be atmospherically important because no-one had previously observed it. Finally, a paper appeared in Science last year showing that sulphuric acid–amine nucleation is the key process driving new particle formation in Chinese megacities.

In CLOUD you’re the theorist and the experimentalist at the same time – like it was in the early days of particle physics

A big result from the point of view of climate change came in 2016 when we showed that trees alone are capable of producing abundant particles and thus cloud seeds. Prior to that it was thought that sulphuric acid was essential to form aerosol particles. Since sulphuric acid was five times lower in the pre-industrial atmosphere, climate models assumed that clouds were fewer and thinner back then. This is important because the pre-industrial era is the baseline aerosol state from which we assess anthropogenic impacts. The fact that biogenic vapours make lots of aerosols and cloud droplets reduces the contrast in cloud coverage (and thus the amount of cooling offset) between then and now. The formation rate of these pure biogenic particles is enhanced by up to a factor 100 by galactic cosmic rays, so the pristine pre-industrial atmosphere was more sensitive to cosmic rays than today’s polluted atmosphere.

There was an important result the very first week we turned on CLOUD, when we saw that sulphuric acid does not nucleate on its own but requires ammonia. Before CLOUD started, people were measuring particles but they weren’t able to measure the molecular composition, so many experiments were being fooled by unknown contaminants.

Have CLOUD results impacted climate policy?

The global climate models that inform the Intergovernmental Panel on Climate Change (IPCC) have begun to incorporate CLOUD aerosol parameterisations, and they are impacting estimates of Earth’s climate sensitivity. The IPCC assessments are hugely impressive works of the highest scientific quality. Yet, there is something of a disconnect between what climate modellers do and what we do in the experimental and observational world. The modellers tend to work in national centres and connect with experiments through the latter’s publications, at the end of the chain. I would like to see much closer linkage between the models and the measurements, as we do in particle physics where there is a fluid connection between theory, experiment and modelling. We do this already in CLOUD, where we have several institutes who are primarily working on regional and global aerosol-cloud models.

What’s next on CLOUD’s horizon?

The East Hall at the PS is being completely rebuilt during CERN’s current long shutdown, but the CLOUD chamber itself is pretty much the only item that is untouched. When the East Area is rebuilt there will be a new beamline and a new experimental zone for CLOUD. We think we have a 10-year programme ahead to address the questions we want to and to settle the cosmic ray–cloud–climate question. That will take me up to just over 80 years old!

Will humanity succeed in preventing catastrophic climate change?

I am an optimist, so I believe there is always a way out of everything. It’s very understandable that people want to freeze the exact temperature of Earth as it is now because we don’t want to see a flood or desert in our back garden. But I’m afraid that’s not how Earth is, even without the anthropogenic influence. Earth has gone through much larger natural climate oscillations, even on the recent timescale of Homo sapiens. That being said, I think Earth’s climate is fundamentally stable. Oceans cover two thirds of Earth’s surface and their latent heat of vaporisation is a huge stabiliser of climate – they have never evaporated nor completely frozen over. Also, only around 2% of CO2 is in the atmosphere and most of the rest is dissolved in the oceans, so eventually, over the course of several centuries, CO2 in the atmosphere will equilibrate at near pre-industrial levels. The current warming is an important change – and some argue it could produce a climate tipping point – but Earth has gone through larger changes in the past and life has continued. So we should not be too pessimistic about Earth’s future. And we shouldn’t conflate pollution and climate change. Reducing pollution is an absolute no-brainer, but environmental pollution is a separate issue from climate change and should be treated as such.

The post Lofty thinking appeared first on CERN Courier.

]]>
Opinion CERN’s CLOUD experiment has merged the best of particle physics and atmospheric science into a novel experimental approach, says spokesperson Jasper Kirkby. https://cerncourier.com/wp-content/uploads/2020/07/CCJulAug20_INT_cloud2.jpg
Surveying the surveyors https://cerncourier.com/a/surveying-the-surveyors/ Tue, 07 Jul 2020 09:10:30 +0000 https://preview-courier.web.cern.ch/?p=87759 The need at CERN to align components within a fraction of a millimetre demands skills and tools beyond the scope of normal surveyor jobs.

The post Surveying the surveyors appeared first on CERN Courier.

]]>
Alban Vieille

A career as a surveyor offers the best of two worlds, thinks Dominique Missiaen, a senior member of CERN’s survey, mechatronics and measurements (SMM) group: “I wanted to be a surveyor because I felt I would like to be inside part of the time and outside the other, though being at CERN is the opposite because the field is in the tunnels!” After qualifying as a surveyor and spending time doing metrology for a cement plant in Burma and for the Sorbonne in Paris, Missiaen arrived at CERN as a stagiaire in 1986. He never left, starting in a staff position working on the alignment of the pre-injector for LEP, then of LEP itself, and then leading the internal metrology of the magnets for the LHC. From 2009 to 2018 he was in charge of the whole survey section, and since last year he has had a new role as coordinator for special projects, such as the development of a train to remotely survey the magnets in the arcs of the LHC.

“Being a surveyor at CERN is completely different to other surveying jobs,” explains Missiaen. “We are asked to align components within a couple of tenths of a millimetre, whereas in the normal world they tend to work with an accuracy of 1–2 cm, so we have to develop new and special techniques.”

A history of precision

When building the Proton Synchrotron in the 1950s, engineers needed an instrument to align components to 50 microns in the horizontal plane. A device to measure such distances did not exist on the market, so the early CERN team invented the “distinvar” – an instrument to ensure the nominal tension of an invar wire while measuring the small length to be added to obtain the distance between two points. It was still used as recently as 10 years ago, says Missiaen. Another “stretched wire” technique developed for the ISR in the 1960s and still in use today replaces small-angle measurements by a short-distance measurement: instead of measuring the angle between two directions, AB and AC, using a theodolite, it measures the distance between the point B and the line AC. The AC line is realised by a nylon wire, while the distance is measured using a device invented at CERN called the “ecartometer”.
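
The gain from swapping an angle for a distance is easy to see with a little geometry. The following minimal Python sketch – with invented coordinates, and in no way CERN’s survey software – computes the offset of a point B from the line AC, together with the equivalent angle a theodolite would otherwise have to resolve:

```python
import math

# Toy illustration of the stretched-wire idea: measure how far point B sits
# off the line AC, instead of measuring the tiny angle BAC at a theodolite.
# All coordinates are invented for the example.

def offset_from_line(a, b, c):
    """Perpendicular distance from point b to the line through a and c (2D)."""
    (ax, ay), (bx, by), (cx, cy) = a, b, c
    # The cross product of AC and AB gives twice the area of triangle ABC;
    # dividing by |AC| yields the perpendicular offset of B from line AC.
    cross = (cx - ax) * (by - ay) - (cy - ay) * (bx - ax)
    return abs(cross) / math.hypot(cx - ax, cy - ay)

# Hypothetical magnet fiducials (metres): B sits 0.2 mm off the A-C line.
a, b, c = (0.0, 0.0), (25.0, 0.0002), (50.0, 0.0)
d = offset_from_line(a, b, c)
alpha = math.atan2(d, math.dist(a, b))  # equivalent angle at A, in radians
print(f"offset = {d * 1000:.3f} mm, equivalent angle = {alpha * 1e6:.1f} microrad")
```

Measuring a 0.2 mm offset against a wire directly is a far less demanding observation than resolving the equivalent angle of a few microradians, which is the advantage the technique exploits.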

Invention and innovation haven’t stopped. The SMM group recently adapted a metrology technique called frequency-sweeping interferometry for use in a cryogenic environment, to align components inside the sealed cryostats of the future High-Luminosity LHC (HL-LHC), which contract by up to 12 mm when cooled to operational temperatures. Another recent innovation, developed in collaboration with the Institute of Plasma Physics in Prague while working on the challenging alignment system for HIE-ISOLDE, is a non-diffracting laser beam with a central axis that diverges by just a few mm over distances of several hundred metres and which can “reconstruct” itself after meeting an obstacle.
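
The principle behind frequency-sweeping interferometry can be stated compactly. The relation below is the textbook one, given as a sketch of the idea rather than the detail of the HL-LHC implementation:

```latex
% Textbook frequency-sweeping interferometry (a sketch of the principle,
% not the HL-LHC implementation). An interferometer arm with optical path
% difference L accumulates phase
\[
  \phi = \frac{2\pi\,\nu\,L}{c},
\]
% so sweeping the laser frequency by Delta(nu) while counting N interference
% fringes yields the absolute distance, with no integer-wavelength ambiguity:
\[
  \Delta\phi = \frac{2\pi\,\Delta\nu\,L}{c}
  \qquad\Rightarrow\qquad
  L = \frac{c\,\Delta\phi}{2\pi\,\Delta\nu} = \frac{c\,N}{\Delta\nu}.
\]
% Example: a 100 GHz sweep over a 1 m optical path gives N of roughly 333.
```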

The specialised nature of surveying at CERN means the team has to spend a lot of time finding the right people and training newcomers. “It’s hard to measure at this level and to maintain the accuracy over long distances, so when we recruit we look for people who have a feeling for this level of precision,” says Missiaen, adding that a constant feed of students is important. “Every year I go back to my engineering school and give a talk about metrology, geodesy and topometry at CERN so that the students understand there is something special they can do in their career. Some are not interested at all, while others are very interested – I never find students in between!”

We see the physics results as a success that we share in too

CERN’s SMM group has more than 120 people, with around 35 staff members. Contractors push the numbers up further during periods such as the current second long shutdown (LS2), during which the group is tasked with measuring all the components of the LHC in the radial and vertical direction. “It takes two years,” says Jean-Frederic Fuchs, who is section leader for accelerators, survey and geodesy. “During a technical stop, we are in charge of the 3D-position determination of the components in the tunnels and their alignment at the level of a few tenths of a millimetre. There is a huge number of accelerator elements along the 63 km of beam lines at CERN.”

Fuchs did his master’s thesis at CERN in the domain of photogrammetry and then left to work in Portugal, where he was in charge of guiding a tunnel-boring machine for a railway project. He returned to CERN in the early 2000s as a fellow, followed by a position as a project associate working on the assembly and alignment of the CMS experiment. He then left to join EDF where he worked on metrology inside nuclear power plants, finally returning to CERN as a staff member in 2011 working on accelerator alignment. “I too sought a career in which I didn’t have to spend too much time in the office. I also liked the balance between measurements and calculations. Using theodolites and other equipment to get the data is just one aspect of a surveyor’s job – post-treatment of the data and planning for measurement campaigns is also a big part of what we do.”

With experience in both experiment and accelerator alignment, Fuchs knows all too well the importance of surveying at CERN. Some areas of the LHC tunnel are moving by about 1 mm per year due to underground movement inside the rock. The tunnel is rising at point 5 (where CMS is located) and falling between P7 and P8, near ATLAS, while the huge mass of the LHC experiments largely keeps them at the same vertical position, therefore requiring significant realignment of the LHC magnets. During LS2, the SMM group plans to lower the LHC at point 5 by 3 mm to better match the CMS interaction point by adjusting jacks that allow the LHC to be raised or lowered by around 20 mm in each direction. For newer installations, the movement can be much greater. For example, LINAC4 has moved up by 5 mm in the source area, leading to a slope that must be corrected. The new beam-dump tunnels in the LHC and the freshly excavated HL-LHC tunnels in points 1 and 5 are also moving slightly compared to the main LHC tunnel. “Today we almost know all the places where it moves,” says Fuchs. “For sure, if you want to run the LHC for another 18 years there will be a lot of measurement and realignment work to be done.” His team also works closely with machine physicists to compare its measurements to those performed with the beams themselves.

It is clear that CERN’s accelerator infrastructure could not function at the level it does without the field and office work of surveyors. “We see the physics results as a success that we share in too,” says Missiaen. “When the LHC turned on you couldn’t know if a mistake had been made somewhere, so in seeing the beam go from one point to another, we take pride that we have made that possible.”

The post Surveying the surveyors appeared first on CERN Courier.

]]>
Careers The need at CERN to align components within a fraction of a millimetre demands skills and tools beyond the scope of normal surveyor jobs. https://cerncourier.com/wp-content/uploads/2020/07/CCJulAug20_Careers_frontis.jpg
CERN trials graphene for magnetic measurements https://cerncourier.com/a/graphene-trialled-for-magnetic-measurements/ Tue, 16 Jun 2020 16:49:15 +0000 https://preview-courier.web.cern.ch/?p=87568 With atomically thin active sensing components, graphene-based Hall probes incur reduced systematic errors.

The post CERN trials graphene for magnetic measurements appeared first on CERN Courier.

]]>
First isolated in 2004 by physicists at the University of Manchester using pieces of sticky tape and a graphite block, the one-atom-thick carbon allotrope graphene has been touted as a wonder material on account of its exceptional electrical, thermal and physical properties. Turning these properties into scalable commercial devices has proved challenging, however, which makes a recently agreed collaboration between CERN and UK firm Paragraf on graphene-based Hall-probe sensors especially novel.

There is probably no other facility in the world to be able to confirm this, so the project has been a big win on both sides

Ellie Galanis

With particle accelerators requiring large numbers of normal and superconducting magnets, high-precision and reliable magnetic measurements are essential. While the workhorse for these measurements is the rotating-coil magnetometer, with a resolution limit of the order of 10⁻⁸ V·s, the most important tool for local field mapping is the Hall probe, which, when biased with an electrical current, produces a transverse voltage proportional to the field component perpendicular to the sensor. However, measurement uncertainties in the 10⁻⁴ range required for determining field multipoles are difficult to obtain, even with state-of-the-art devices. False signals caused by non-perpendicular field components in the three-dimensional sensing region of existing Hall probes can increase the measurement uncertainty, requiring complex and time-consuming calibration and processing to separate true signals from systematic errors. With an active sensing component made of atomically thin graphene, which is effectively two-dimensional, a graphene-based Hall probe in principle suffers negligible planar Hall effects and could therefore enable higher precision mapping of local magnetic fields.
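
The textbook relations below make this argument concrete; they are illustrative only, not a model of Paragraf’s device:

```latex
% Textbook Hall-plate relations (illustrative; not Paragraf's device model).
% A Hall element of thickness t and carrier density n, biased with current I,
% produces a transverse voltage proportional to the perpendicular field:
\[
  V_{\mathrm{H}} = \frac{I\,B_{\perp}}{n\,e\,t},
\]
% while the spurious planar Hall signal grows quadratically with the
% in-plane field component, at angle theta to the bias current:
\[
  V_{\mathrm{PHE}} \propto B_{\parallel}^{2}\,\sin 2\theta .
\]
% An effectively two-dimensional sensing layer, as in graphene, suppresses
% the planar term, while a small thickness t boosts the wanted signal.
```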

Inspiration strikes

Stephan Russenschuck, head of the magnetic measurement section at CERN, spotted the potential of graphene-based Hall probes when he heard about a talk given by Paragraf – a recent spin-out from the department of materials science at the University of Cambridge – at a magnetic measurement conference in December 2018. This led to a collaboration, formalised between CERN and Paragraf in April, which has seen several graphene sensors installed and tested at CERN during the past year. The firm sought to develop and test the device ahead of a full product launch by the end of this year, and the results so far, based on well-calibrated field measurements in CERN’s reference magnets, have been very promising. “The collaboration has proved that the sensor has no planar effect,” says Paragraf’s Ellie Galanis. “This was a learning step. There is probably no other facility in the world to be able to confirm this, so the project has been a big win on both sides.”

Graphene sensor

The graphene Hall sensor also operates over a wide temperature range, down to liquid-helium temperatures at which superconducting magnets in the LHC operate. “How these sensors behave at cryogenic temperatures is very interesting,” says Russenschuck. “Usually the operation of Hall sensors at cryogenic temperatures requires careful calibration and in situ cross-calibration with fluxmetric methods. Moreover, we are now exploring the sensors on a rotating shaft, which could be a breakthrough for extracting local, transversal field harmonics. Graphene sensors could get rid of the spurious modes that come from nonlinearities and planar effects.”

CERN and Paragraf, which has patented a scalable process for depositing two-dimensional materials directly onto semiconductor-compatible substrates, plan to release a joint white paper communicating the results so far and detailing the sensor’s performance across a range of magnetic fields.

The post CERN trials graphene for magnetic measurements appeared first on CERN Courier.

]]>
News With atomically thin active sensing components, graphene-based Hall probes incur reduced systematic errors. https://cerncourier.com/wp-content/uploads/2020/06/Graphene-tests-at-CERN.jpg
IPAC goes virtual https://cerncourier.com/a/ipac-goes-virtual/ Mon, 08 Jun 2020 12:30:13 +0000 https://preview-courier.web.cern.ch/?p=87539 3000 accelerator specialists gathered in cyber-space for the 11th International Particle Accelerator Conference.

The post IPAC goes virtual appeared first on CERN Courier.

]]>
More than 3000 accelerator specialists gathered in cyber-space from 11 to 14 May for the 11th International Particle Accelerator Conference (IPAC). The conference was originally destined for the GANIL laboratory in Caen, a charming city in Normandy, and host to the flagship radioactive-ion-beam facility SPIRAL-2, but the coronavirus pandemic forced the cancellation of the in-person meeting and the French institutes CNRS/IN2P3, CEA/IRFU, GANIL, Soleil and ESRF agreed to organise a virtual conference. Oral presentations and the accelerator-prize session were maintained, though unfortunately the poster and industry sessions had to be cancelled. The scientific programme committee whittled down more than 2000 proposals for talks into 77 presentations which garnered more than 43,000 video views across 60 countries, making IPAC’20 an involuntary pioneer of virtual conferencing and a lighthouse of science during the lockdown.

Recent trends indicate a move towards the use of permanent magnets

IPAC’20’s success relied on a programme of recent technical highlights, new developments and future plans in the accelerator world. Weighing in at 1,998 views, the most popular talk of the conference was by Ben Shepherd from STFC’s Daresbury Laboratory in the UK, who spoke on high-technology permanent magnets. Accelerators do not only accelerate ensembles of particles, but also use strong magnetic fields to guide and focus them into very small volumes, typically just micrometres or nanometres in size. Recent trends indicate a move towards the use of permanent magnets, which provide strong fields without requiring external power and can offer outstanding field quality. Describing the major advances in permanent magnets in terms of production, radiation resistance, tolerances and field tuning, Shepherd presented high-tech devices developed and used for the SIRIUS, ESRF-EBS, SPring-8, CBETA, SOLEIL and CUBE-ECRIS facilities, and also presented the Zero-Power Tunable Optics (ZEPTO) collaboration between STFC and CERN, which offers 15–60 T/m tunability in quadrupoles and 0.46–1.1 T in dipoles.

Top of the talks

The seven IPAC’20 presentations with the most views included four by outstanding female scientists. CERN Director-General Fabiola Gianotti presented strategic considerations for future accelerator-based particle physics. While pointing out the importance of Europe participating in projects elsewhere in the world, she made the strong point that CERN should host an ambitious future collider, and discussed the options being considered, pointing to the update of the European Strategy for Particle Physics soon to be approved by the CERN Council. Sarah Cousineau from Oak Ridge reported on accelerator R&D as a driver for science in general, pointing out that accelerators have directly contributed to more than 25 Nobel Prizes, including the Higgs-boson discovery at the LHC in 2012. The development of superconducting accelerator technology has enabled projects for colliders, photon science, nuclear physics and neutron spallation sources around the world, with several light sources and neutron facilities currently engaged in COVID-19 studies.

SPIRAL-2 will explore exotic nuclei near the limits of the periodic table

The benefits of accelerator-based photon science for society were also emphasised by Jerry Hastings from Stanford University and SLAC, who presented the tremendous progress in structural biology driven by accelerator-based X-ray sources, and noted that research can be continued during COVID-19 times thanks to the remote synchrotron access pioneered at SSRL. Stressing the value of international collaboration, Hastings presented the outcome of an international X-ray facilities meeting that took place in April and defined an action plan for ensuring the best possible support to COVID-19 research. GANIL Director Alahari Navin presented new horizons in nuclear science, reviewing facilities around the world and presenting his own laboratory’s latest activities. GANIL has now started commissioning SPIRAL-2, which will allow users to explore the as-yet unknown properties of exotic nuclei near the limits of the periodic table of elements, and has performed its initial science experiment. Liu Lin from LNLS in Brazil presented the commissioning results for the new fourth-generation SIRIUS light source, showing that the functionality of the facility has already been demonstrated by storing 15 mA of beam current. Last, but not least in the top-seven most-viewed talks, Anke-Susanne Müller from KIT presented the status of the study for a 100 km Future Circular Collider – just one of the options for an ambitious post-LHC project at CERN.

Many other highlights from the accelerator field were presented during IPAC’20. Kyo Shibata (KEK) discussed the progress in physics data-taking at the SuperKEKB factory, where the Belle II experiment recently reported its first result. Ferdinand Willeke (BNL) presented the electron-ion collider approved to be built at BNL, Porntip Sudmuang (SLRI) showed construction plans for a new light source in Thailand, and Mohammed Eshraqi (ESS) discussed the construction of the European Spallation Source in Sweden. At the research frontier towards compact accelerators, Chang Hee Nam (IBS, Korea) explained prospects for laser-driven GeV electron beams from plasma-wakefield accelerators and Arnd Specka (LLR/CNRS) showed plans for the compact European plasma-accelerator facility EuPRAXIA, which is entering its next phase after the successful completion of a conceptual-design report. The accelerator-application session rounded the picture off with presentations by Annalisa Patriarca (Institut Curie) about accelerator challenges in a new radiation-therapy technique called FLASH, in which ultra-fast delivery of the radiation dose reduces damage to healthy tissue, by Charlotte Duchemin (CERN) on the production of non-conventional radionuclides for medical research at the MEDICIS hadron-beam facility, by Toms Torims (Riga Technical University) on the treatment of marine exhaust gases using electron beams and by Adrian Fabich (SCK-CEN) on proton-driven nuclear-waste transmutation.

To the credit of the French organisers, the virtual setup worked seamlessly. The concept relied on pre-recorded presentations and a text-driven chat function which allowed registered participants to take part from time zones across the world. Activating the sessions in half-day steps preserved the appearance of live presentations to some degree, before a final live session, during which the four prizes of the accelerator group of the European Physical Society were awarded.

The post IPAC goes virtual appeared first on CERN Courier.

]]>
Meeting report 3000 accelerator specialists gathered in cyber-space for the 11th International Particle Accelerator Conference. https://cerncourier.com/wp-content/uploads/2020/06/IPAC-1000.jpg
Bridging Europe’s neutron gap https://cerncourier.com/a/bridging-europes-neutron-gap/ Fri, 24 Apr 2020 16:02:13 +0000 https://preview-courier.web.cern.ch/?p=87241 The recent closure of reactors means making the most of existing facilities while preparing accelerator-based sources, says Helmut Schober.

The post Bridging Europe’s neutron gap appeared first on CERN Courier.

]]>
The Institut Laue-Langevin

In increasing its focus towards averting environmental disaster and maintaining economic competitiveness, both the European Union and national governments are looking towards green technologies, such as materials for sustainable energy production and storage. Such ambitions rely on our ability to innovate – powered by Europe’s highly developed academic network and research infrastructures.

Neutron science holds enormous potential at every stage of innovation

Europe is home to world-leading neutron facilities that each year are used by more than 5000 researchers across all fields of science. Studies range from the dynamics of lithium-ion batteries, to developing medicines against viral diseases, in addition to fundamental studies such as measurements of the neutron electric-dipole moment. Neutron science holds enormous potential at every stage of innovation, from basic research through to commercialisation, with at least 50% of publications globally attributed to European researchers. Yet, just as the demand for neutron science is growing, access to facilities is being challenged.

Helmut Schober

Three of Europe’s neutron facilities closed in 2019: BER II in Berlin; Orphée in Paris; and JEEP II outside Oslo. The rationale is specific to each case. There are lifespan considerations due to financial resources, but also political considerations when it comes to nuclear installations. The potentially negative consequences of these closures must be carefully managed to ensure expertise is maintained and communities are not left stranded. This constitutes a real challenge for the remaining facilities. Sharing the load via strategic collaboration is indispensable, and is the motivation behind the recently created League of advanced European Neutron Sources (LENS).

We must also ensure that the remaining facilities – which include the FRM II in Munich, the Institut Laue-Langevin (ILL) in France, ISIS in the UK and the SINQ facility in Switzerland – are fully exploited. These facilities have been upgraded in recent years, but their long-term viability must be secured. This is not to be underestimated. For example, 20% of the ILL’s budget relies on the contributions of 10 scientific members that must be renegotiated every five years. The rest is provided by the ILL’s three associate countries (France, Germany and the UK). The loss of one of its major scientific members, even only partially, would severely threaten the ILL’s upgrade capacity.

Accelerator sources

The European Spallation Source (ESS) under construction in Sweden, which was conceived more than 20 years ago, must become a fully operating neutron facility at the earliest possible date. First operation was initially foreseen for 2019 and is now scheduled for 2023. Europe must ask itself why building large scientific facilities such as ESS, or FAIR in Germany, takes so long, despite significant strategic planning (e.g. via ESFRI) and sophisticated project management. After all, neutron-science pioneers built the original ILL in just over four years, though admittedly at a time of less regulatory pressure. We must regain that agility. The Chinese Spallation Neutron Source has just reached its design goal of 100 kW, and the Spallation Neutron Source in Oak Ridge, Tennessee, is actively pursuing plans for a second target station.

The value of neutron science will be judged on its contribution to solving society’s problems

We therefore need to look to next-generation sources such as Compact Accelerator-driven Neutron Sources (CANS). In contrast to spallation sources, which produce neutrons by bombarding heavy nuclei with high-energy protons, CANS rely on nuclear processes that can be triggered by proton bombardment in the 5 to 50 MeV range. While these processes are less efficient than spallation, they allow for a more compact target and moderator design. Examples of this scheme are SONATE, currently under development at CEA-Saclay, and the High Brilliance Source being pursued at Jülich. CANS must now be brought to maturity, requiring carefully planned business models to identify how they can best reinforce the ecosystem of neutron science.

It is also important to begin strategic discussions that aim beyond 2030, including the need for powerful new national sources that will complement the ESS. Continuous (reactor) neutron sources must be part of this because many applications, such as the production of neutron-rich isotopes for medical purposes, require the highest time-averaged neutron flux. Such a strategic evaluation is currently under way in the US, and Europe should soon follow suit.

Despite last year’s reactor closures, Europe is well prepared for the next decade thanks to the continuous modernisation of existing sources and investment in the ESS. The value of neutron science will be judged on its contribution to solving society’s problems, and I am convinced that European researchers will rise to the challenge and carve a route to a greener future through world-leading neutron science.

The post Bridging Europe’s neutron gap appeared first on CERN Courier.

]]>
Opinion The recent closure of reactors means making the most of existing facilities while preparing accelerator-based sources, says Helmut Schober. https://cerncourier.com/wp-content/uploads/2020/04/si-eurofacilitiesREV.jpg
Particle physicists propose stripped-down ventilator to help combat COVID-19 https://cerncourier.com/a/particle-physicists-propose-stripped-down-ventilator-to-help-combat-covid-19/ Fri, 03 Apr 2020 19:01:24 +0000 https://preview-courier.web.cern.ch/?p=87070 The proposal, led by physicists and engineers from LHCb, is one of several being developed to meet growing demand for ventilators.

The post Particle physicists propose stripped-down ventilator to help combat COVID-19 appeared first on CERN Courier.

]]>
A preliminary CAD model of the HEV unit. Credit: HEV Collaboration.

As part of the global response to the COVID-19 pandemic, a team led by physicists and engineers from the LHCb collaboration has proposed a design for a novel ventilator. The High Energy Ventilator (HEV) is based on components which are simple and cheap to source, complies with hospital standards, and supports the most requested ventilator-operation modes, writes the newly formed HEV collaboration. Though the system needs to be verified by medical experts before it can enter use, in the interests of rapid development the HEV team has presented the design to generate feedback, corrections and support as the project progresses. The proposal is one of several recent and rapidly developing efforts launched by high-energy physicists to help combat COVID-19.

The majority of people who contract COVID-19 suffer mild symptoms, but in some cases the disease can cause severe breathing difficulties and pneumonia. For such patients, the availability of ventilators that deliver oxygen to the lungs while removing carbon dioxide could be the difference between life and death. Even with existing ventilator suppliers ramping up production, the rapid rise in COVID-19 infections is causing a global shortage of devices. Multiple efforts are therefore being mounted by governments, industry and academia to meet the demand, with firms which normally operate in completely different sectors – such as Dyson and General Motors – diverting resources to the task.

There are many proposals on the market, but we don’t know now which ones will in the end make a difference, so everything which could be viable should be pursued

Paula Collins

HEV was born out of discussions in the LHCb VELO group, when lead-designer Jan Buytaert (CERN) realised that the systems which are routinely used to supply and control gas at desired temperatures and pressures in particle-physics detectors are well matched to the techniques required to build and operate a ventilator. The team started from a set of guidelines recently drawn up by the UK government’s Medicines and Healthcare products Regulatory Agency regarding rapidly manufactured ventilator systems, and was encouraged by a 3D-printed prototype constructed at the University of Liverpool in response to these guidelines. The driving pressure of ventilators – which must be able to handle rapidly changing lung compliance, as well as potential lung collapse and consolidation – is a crucial factor for patient outcomes. The HEV team therefore aimed to produce a patient-safety-first design with a gentle and precise pressure control that is responsive to the needs of the patient, and which offers internationally recommended operation modes.

As the HEV team comprises physicists, not medics, explains HEV collaborator Paula Collins of CERN, it was vital to get the relevant input from the very start. “Here we have benefitted enormously from the experience and knowledge of CERN’s HSE [occupational health & safety and environmental protection] group for medical advice, conformity with applicable legislation and health-and-safety requirements, and the working relationship with local hospitals. The team is also greatly supported from other CERN departments, in particular for electronic design and the selection of the best components for gas manipulation. During lockdown, the world is turning to remote connection, and we were very encouraged to find that it was possible in a short space of time to set up an online chat group of experienced anesthesiologists and respiratory experts from Australia, Belgium, Switzerland and Germany, which sped up the design considerably.”

Prototyping the HEV buffer concept at CERN to demonstrate “breathing” and flow capabilities of the device. The demonstrator is built with in-house parts and looks mechanically very different to the final system. Control is provided via LabVIEW, whereas the final system will use an embedded controller. Credit: HEV Collaboration.

Conceptual design of the HEV ventilator. Credit: HEV Collaboration.

Stripped-down approach

The HEV concept relies on easy-to-source components, which include electro-valves, a two-litre buffer container, a pressure regulator and several pressure sensors. Embedded components — currently Arduino and Raspberry Pi — are being used to address portability requirements. The unit’s functionality will be comprehensive enough to provide long-term support to patients in the initial or recovery phases, or with milder symptoms, freeing up high-end machines for the most serious intensive care, explains Collins: “It will incorporate touchscreen control intuitive to use for qualified medical personnel, even if they are not specialists in ventilator use, and it will include extensive monitoring and failsafe mechanisms based on CERN’s long experience in this area, with online training to be developed.”
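
To make the buffer principle concrete, here is a deliberately simplified Python sketch of one fill-release-vent breathing cycle. Every name, number and class in it is a hypothetical stand-in, not the HEV design, which adds pressure regulation, sensing, extensive monitoring and failsafes:

```python
import time

# Toy sketch of a buffer-based breathing cycle of the kind described above:
# release gas from a pre-filled buffer to the patient at a regulated pressure,
# then vent the circuit while refilling the buffer for the next breath.
# All names and values are invented for illustration only.

BREATHS_PER_MIN = 15
INSP_FRACTION = 0.33   # fraction of each cycle spent on inspiration

class DummyValve:
    """Stand-in for an electro-valve driver (e.g. a GPIO line on a Raspberry Pi)."""
    def __init__(self, name):
        self.name = name
    def set(self, open_):
        print(f"valve {self.name}: {'open' if open_ else 'closed'}")

def one_breath(fill, inhale, exhale):
    period = 60.0 / BREATHS_PER_MIN
    t_insp = period * INSP_FRACTION
    # 1) Inspiration: buffer -> patient, at the regulator's set pressure.
    inhale.set(True); exhale.set(False)
    time.sleep(t_insp)
    # 2) Expiration: vent the patient circuit down towards the end-expiratory
    #    pressure while refilling the buffer from the gas supply.
    inhale.set(False); exhale.set(True); fill.set(True)
    time.sleep(period - t_insp)
    fill.set(False); exhale.set(False)

if __name__ == "__main__":
    fill = DummyValve("supply->buffer")
    inhale = DummyValve("buffer->patient")
    exhale = DummyValve("patient->exhaust")
    one_breath(fill, inhale, exhale)
```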

The first stage of prototyping, which was achieved at CERN on 27 March, was to demonstrate that the HEV working principle is sound and allows the ventilator to operate within the required ranges of pressure and time. The desired physical characteristics of the pressure regulators, valves and pressure sensors are now being refined, and the support of clinicians and international organisations is being harnessed for further prototyping and deployment stages. “This is a device which has patient safety as a major priority,” says HEV collaborator Themis Bowcock of the University of Liverpool. “It is aimed at deployment round the world, also in places that do not necessarily have state-of-the-art facilities.”

Complementary designs

The HEV concept complements another recent ventilator proposal, initiated by physicists in the Global Argon Dark Matter Collaboration. The Mechanical Ventilator Milano (MVM) is optimised to permit large-scale production in a short amount of time and at a limited cost, also relying on off-the-shelf components that are readily available. In contrast to the HEV design, which aims to control pressure by alternately filling and emptying a buffer, the MVM project regulates the flow of the incoming mixture of oxygen and air via electrically controlled valves. The proposal stems from a cooperation of particle- and nuclear-physics laboratories and universities in Canada, Italy and the US, with an initial goal to produce up to 1000 units in each of the three countries while the interim certification process is ongoing. Clinical requirements are being developed with medical experts, and detailed testing and qualification of the first prototype is presently underway with a breathing simulator at Ospedale San Gerardo in Monza, Italy.

Sharing several common ideas with the MVM principle, but with emphasis on further reducing the number and specificity of components to make construction possible during times of logistical disruption, a team led by particle physicists at the Laboratory of Instrumentation and Experimental Particle Physics in Portugal has also posted a proof-of-concept study for a ventilator on arXiv. All ventilator designs are evolving quickly and require further development before they can be deployed in hospitals.

“It is difficult to conceive a project which goes all the way and includes all the bells and whistles needed to get it into the hospital, but this is our firm goal,” says Collins. “After one week we had a functioning demonstrator, after two weeks we aim to test on a medical mechanical lung and to start prototyping in the hospital context. We find ourselves in a unique and urgent situation where there are many proposals on the market, but we don’t know now which ones will in the end make a difference, so everything which could be viable should be pursued.”

The post Particle physicists propose stripped-down ventilator to help combat COVID-19 appeared first on CERN Courier.

]]>
News The proposal, led by physicists and engineers from LHCb, is one of several being developed to meet growing demand for ventilators. https://cerncourier.com/wp-content/uploads/2020/04/HEV_humans-WEB.jpg
Synchrotrons on the coronavirus frontline https://cerncourier.com/a/synchrotrons-on-the-coronavirus-frontline/ Tue, 24 Mar 2020 16:42:41 +0000 https://preview-courier.web.cern.ch/?p=86906 Researchers at Diamond Light Source and other synchrotron facilities are using intense X-rays to decipher the biochemical and structural makeup of the SARS-CoV-2 virus.

The post Synchrotrons on the coronavirus frontline appeared first on CERN Courier.

]]>
Representation of the 3D structure of the main SARS-CoV-2 protease, obtained using Diamond Light Source. The coils represent “alpha” helices and the flatter arrows are “beta sheets”, with loops connecting them together. The organisation of alpha helices and beta sheets is often referred to as the secondary structure of the protein (with the primary sequence being the amino acid sequence and the tertiary structure being the overall 3D shape of the protein). Credit: D Owen/Diamond Light Source.

At a time when many countries are locking down borders, limiting public gatherings, and encouraging isolation, the Diamond Light Source in Oxfordshire, UK, has been ramping up its intensity, albeit in an organised and controlled manner. The reason: these scientists are working tirelessly on drug-discovery efforts to quell COVID-19.

It is a story that requires fast detectors, reliable robotics and powerful computing infrastructures, artificial intelligence, and one of the brightest X-ray sources in the world. And it is made possible by international collaboration, dedication, determination and perseverance.

Synchrotron light sources are particle accelerators capable of producing incredibly bright X-rays, by forcing relativistic electrons to accelerate on curved trajectories. Around 50 facilities exist worldwide, enabling studies over a vast range of topics. Fanning out tangentially from Diamond’s 562-m circumference storage ring are more than 30 beamlines equipped with instrumentation to serve a multitude of user experiments. The intensely bright X-rays (corresponding to a flux of around 9 × 10¹² photons per second) are necessary for determining the atomic structure of proteins, including the proteins which make up viruses. As such, synchrotron light sources around the world are interrupting their usual operations to work on mapping the structure of the SARS-CoV-2 virus.

Knowing the atomic structure of the virus is like knowing how the enemy thinks

Knowing the atomic structure of the virus is like knowing how the enemy thinks. A 3D visualisation of the building blocks of the structure at an atomic level would allow scientists to understand how the virus functions. Enzymes, the molecular machines that allow the virus to replicate, are key to this process. Scientists at Diamond are exploring the binding site of the main SARS-CoV-2 protease. A drug that binds to this enzyme’s active site would throw a chemical spanner in the works, blocking the virus’s ability to replicate and limiting the spread of the disease.

By way of reminder: Coronavirus is the family of viruses responsible for the common cold, MERS, SARS, etc. Novel coronavirus, aka SARS-CoV-2, is the newly discovered type of coronavirus, and COVID-19 is the disease which it causes.

Call to arms

On 26 January, Diamond’s life-sciences director, Dave Stuart, received a phone call from structural biologist Zihe Rao of ShanghaiTech University in China. Rao, along with his colleague Haitao Yang, had solved the structure of the main SARS-CoV-2 protease with a covalent inhibitor using the Shanghai Synchrotron Radiation Facility (SSRF). Furthermore, they had made the solution freely and publicly available on the worldwide Protein Data Bank.

During the phone call, Rao informed Stuart that their work had been halted by a scheduled shutdown of the SSRF. The Diamond team rapidly mobilised. Since shipping biological samples from Shanghai at the height of the coronavirus outbreak in China was expected to be problematic, the team at Diamond ordered the synthetic gene. A synthetic gene can be generated provided the ordering of T, A, C and G nucleotides in the DNA sequence is known. That synthetic gene can be genetically engineered into a bacterium, in this case Escherichia coli, which reads the sequence and generates the coronavirus protease in large enough quantities for the researchers at Diamond to determine its structure and screen for potential inhibitors.
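
How a bacterium turns that sequence of nucleotides into protein can be illustrated in a few lines of Python: the gene is read three letters (one codon) at a time, and each codon maps to one amino acid. The sequence and the six-entry codon table below are toy examples, not the protease gene:

```python
# Minimal illustration of how a DNA sequence encodes a protein. The table
# covers only the codons used by the toy sequence; the sequence itself is
# invented and has nothing to do with the SARS-CoV-2 protease gene.

CODON_TABLE = {
    "ATG": "M",  # methionine (start)
    "TCT": "S",  # serine
    "GGT": "G",  # glycine
    "TTT": "F",  # phenylalanine
    "CGT": "R",  # arginine
    "TAA": "*",  # stop
}

def translate(dna):
    """Read the sequence codon by codon until a stop codon is reached."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        aa = CODON_TABLE[dna[i:i + 3]]
        if aa == "*":
            break
        protein.append(aa)
    return "".join(protein)

print(translate("ATGTCTGGTTTTCGTTAA"))  # -> "MSGFR"
```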

Eleven days later, on 10 February, the synthetic gene arrived. At this point, Martin Walsh, Diamond’s deputy director of life sciences, and his team (consisting of Claire Strain-Damerell, Petra Lukacik and David Owen) dropped everything. With the gene in hand, the group immediately set up experimental trials to try to generate protein crystals. In order to determine the atomic structure, they needed a crystal containing millions of proteins in an ordered grid-like structure.

Diamond Light Source, UK

X-ray radiation bright enough for the rapid analysis of protein structures can only be produced by a synchrotron light source. The X-rays are directed and focused down a beamline onto a crystal and, as they pass through it, they diffract. From the diffraction pattern, researchers can work backwards to determine the 3D electron density maps and the structure of the protein. The result is a complex curled ribbon-like structure with an intricate mess of twists and turns of the protein chain.
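
At its core, this “working backwards” is a Fourier synthesis: the electron density is the inverse Fourier transform of the measured structure factors. The minimal Python sketch below illustrates the idea with toy data; a real map calculation must first recover the phases lost in measurement (for example by molecular replacement), a step glossed over here.

```python
import numpy as np

# Toy illustration of a crystallographic map calculation: the electron
# density rho(x, y, z) is the Fourier synthesis of the complex structure
# factors F(h, k, l). Experiments record only |F|; the phases used here
# are invented, standing in for a separate phasing step.
N = 32                                    # grid points per unit-cell axis (toy value)
rng = np.random.default_rng(0)

F = rng.normal(size=(N, N, N)) + 1j * rng.normal(size=(N, N, N))
F[0, 0, 0] = 500.0                        # F(000) sets the total electron count

# rho(r) = (1/V) * sum_hkl F(hkl) * exp(-2*pi*i*(hx + ky + lz)); on a grid
# this is a fast Fourier transform (the cell volume V is absorbed into the
# scaling). Taking the real part stands in for enforcing Friedel symmetry.
rho = (np.fft.fftn(F) / N**3).real

print("map shape:", rho.shape, "| peak density (arb. units): %.2f" % rho.max())
```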

The Diamond team set up numerous trials to find the optimum conditions for crystallisation of the SARS-CoV-2 protease. They modified the pH, the precipitating compounds, the chemical composition, the protein-to-solution ratio… every parameter they could vary, they did. Every day they would produce a few thousand trials, of which only a few hundred would produce crystals, and even fewer would produce crystals of sufficient quality. Within a few days of receiving the gene, the first crystals were being produced. They were paltry and thin crystals, but large enough to be tested on one of Diamond’s macromolecular crystallography beamlines.
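
A little combinatorics shows how quickly such a screen grows: a handful of values per parameter multiplies into hundreds of conditions before replicates are even counted. The sketch below is purely hypothetical – the parameter values are invented, not Diamond’s actual screen.

```python
from itertools import product

# Hypothetical coarse crystallisation screen; every value below is
# illustrative, not taken from Diamond's real trials.
ph_values       = [6.0, 6.5, 7.0, 7.5, 8.0]
precipitants    = ["PEG 4000", "PEG 6000", "ammonium sulfate"]
precip_percent  = [10, 15, 20, 25]          # % w/v
protein_ratio   = [0.5, 1.0, 2.0]           # protein-to-solution ratio

conditions = list(product(ph_values, precipitants, precip_percent, protein_ratio))
print(f"{len(conditions)} conditions per sweep")      # 5 * 3 * 4 * 3 = 180

for ph, precip, pct, ratio in conditions[:3]:         # peek at the first few
    print(f"pH {ph}, {pct}% {precip}, ratio {ratio}")
```

With a few replicate drops per condition and daily refinement of the ranges, a grid like this easily reaches the few thousand trials per day described above.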

Watching the results come through, Diamond postdoc David Owen described it as the first moment of intense excitement. With crystals that appeared to be “flat like a car windshield,” he was dubious as to whether they would diffract at all. Nevertheless, the team placed the crystals in the beamline with a resignation that quickly turned into intense curiosity as the results started appearing before them. At that moment Owen remembers his doubts fading, as he thought, “this might just work!” And work it did. In fact, Owen recalls, “they diffracted beautifully.” These first diffraction patterns of the SARS-CoV-2 protease were recorded with a resolution of 1.9 ångström (1.9 × 10⁻¹⁰ m) — high enough resolution to see the position of all of the chemical groups that allow the protease to do its work.

By 19 February, through constant adjustments and learning, the team knew they could grow good-quality crystals quickly. It was time to bring in more colleagues. The XChem team at Diamond joined the mission to set up fragment-based screening – whereby a vast library of small molecules (“fragments”) is soaked into crystals of the viral protease. These fragments are significantly smaller and functionally simpler than most drug molecules, and fragment screening is a powerful approach to selecting candidates for early drug discovery. By 26 February, 600 crystals had been mounted and the first fragment screen launched. In parallel, the team had been making a series of samples to send to a company in Oxford called Exscientia, which has set up an AI platform designed to expedite the identification of candidates in drug discovery.

Drug-discovery potential

As of early March, 1500 crystals and fragments have been analysed. Owen attributes the team’s success so far to the incredible amounts of data they could collect and analyse quickly. With huge numbers of data sets, they could pin down the parameters of the viral protease with a high degree of confidence. And with the synchrotron light source they were able to create and analyse the diffraction patterns rapidly. The same amount of data collected with a lab-based X-ray source would have taken approximately 10 years. At Diamond, they were able to collect the data in a few days of accumulated beamtime.

A close up view of some residues in the active site of the protein, where the sticks represent the protein molecules and the mesh represents the electron density. Credit: D Owen/Diamond Light Source.

Synchrotron light sources all over the world have been granting priority and rapid access to researchers to support their efforts in discovering more about the virus. Researchers at the Advanced Photon Source at Argonne, US, and at Elettra Sincrotrone in Trieste, Italy, are also trying to identify molecules effective against COVID-19, in an attempt to bring us closer to an effective vaccine or treatment. This week, the ESRF in Grenoble, France, announced that it will make its cryo-electron microscope facility available for use. The community platform www.lightsources.org offers an overview of access arrangements and calls for proposals.

Synchrotron light sources all over the world have been granting priority and rapid access

In addition to allowing tens of thousands of biological structures to be elucidated – such as that of the ribosome, which was recognised by the 2009 Nobel Prize in Chemistry – light sources have a strong pedigree in elucidating the structure of viruses. The development of common antiviral medications that block the action of viruses in the body, such as Tamiflu or Relenza, also relied upon synchrotrons to reveal the relevant atomic structures.

Mapping the SARS-CoV-2 protease structures bound to small chemical fragments, the Diamond team demonstrated a crystallography and fragment-screening tour de force. The resulting and ongoing work is a crucial first step in developing a drug. Forgoing the usual academic route of peer review, the Diamond team have made all of their results openly and freely available to help inform the public health response and limit the spread of the virus, in the hope that this can fast-track effective treatment options.

The post Synchrotrons on the coronavirus frontline appeared first on CERN Courier.

]]>
Feature Researchers at Diamond Light Source and other synchrotron facilities are using intense X-rays to decipher the biochemical and structural makeup of the SARS-CoV-2 virus. https://cerncourier.com/wp-content/uploads/2020/03/SARS-Cov-2-DLS-struture.png
A unique exercise in scientific diplomacy https://cerncourier.com/a/a-unique-exercise-in-scientific-diplomacy/ Mon, 23 Mar 2020 18:15:33 +0000 https://preview-courier.web.cern.ch/?p=86875 Michel Claessens’ new book ITER: The Giant Fusion Reactor unfolds 40 years of the history of nuclear fusion, says Lucio Rossi.

The post A unique exercise in scientific diplomacy appeared first on CERN Courier.

]]>
The International Thermonuclear Experimental Reactor — now simply ITER — is a unique exercise in scientific diplomacy, and a politically driven project. It is also the largest international collaboration, and a milestone in the technological history of mankind. These, I would say, are the main conclusions of Michel Claessens’ new book ITER: The Giant Fusion Reactor. He unfolds a fascinating story which criss-crosses more than 40 years of the history of nuclear fusion in a simple, but not simplistic, way which is accessible to anyone with a will to stick to facts without prejudices. The full range of opinions on ITER’s controversial benefits and detriments are exposed and discussed in a fair way, and the author never hides his personal connection to the project as its head of communications for many years.

ITER Claessens cover

Why don’t we more resolutely pursue a technology that could contribute to the production of carbon-free energy? ITER’s path has been plagued by rivalries between strong personalities, and difficult technical and political decisions, though, in retrospect, few domains of science and technology have received such strong and continuous support from governments and agencies. Claessens’ book begins by discussing the need for fusion among other energy sources — he avoids selling fusion as the “unique and final” solution to energy problems — and quickly brings us to the heart of a key problem humanity is facing today. Travelling through history, the author shows that when politicians take decisions of high inspiration, as at the famous fireside summit between presidents Reagan and Gorbachev in Geneva in November 1985, where the idea for a collaborative project to develop fusion energy for peaceful purposes was born, they change the course of history — for the better! The book then goes through the difficulties of setting up a complex project animated by a political agenda (fusion was on the agenda of political summits between the USA and the USSR since the cold war) without a large laboratory backing it up.

The author shows that when politicians take decisions of high inspiration they change the course of history

Progress with ITER was made more difficult by a complex system of in-kind contributions which were not optimised for cost or technical success, but for political “return” to each member state of ITER (Europe, China, Japan, Russia, South Korea, the US, and most recently India). Claessens’ examples are striking, and he doesn’t skirt around the inevitable hot questions: what is the real cost of ITER? Will it even be finished given its multiple delays? How much of these extra costs and delays are due to the complex and politically oriented governance structures established by the partners? The answers are clear, honestly reported, and quantitative, though the author makes it clear that the numbers should be taken cum grano salis. Assessing the cost of a project where 90% of the components are in-kind contributions, with each partner having its own accounting structures, and in certain cases no desire to reveal the real cost, is a doubtful enterprise. However, we can say with some certainty that ITER is taking twice as long and likely costing more than double what was initially planned — and as the author says on more than one occasion, further delays will likely entail additional costs. By comparison, the LHC needed roughly an additional 25% in both budget and time compared to what was initially planned.

Price tag

Was the initial cost estimate for ITER simply too low, perhaps to help the project get approved, or would better management, with a different governance structure, have performed better? Significantly, I have not met a single knowledgeable person who did not strongly express the view that ITER is a textbook case of bad management organisation, though in my opinion the book does not do justice to the energetic action of the current director general, Bernard Bigot. His directorate has been a turning point in ITER’s construction, and has set the project back on track in a moment of real crisis when many scientists and managers expected the project to fail. A key question surfaces in the book: is the price tag important? ITER’s cost is peanuts compared to the EU’s budget, for example, and it is not significant by comparison to the promise it delivers: carbon-free energy in large quantities, at an affordable cost to the environment, and based on widely distributed fuel.

Michel Claessens’ book explores different points of view without fanaticism

Though there is almost no intrinsic innovation in ITER, Claessens shows how the project has nevertheless pushed tokamak technology beyond its apparent limits by a sheer increase in size, though he neglects some key points, such as the incredible stored energy of the superconducting magnets. An incident similar to that suffered by the LHC in 2008 would be a logistical nightmare for ITER, as it contains more than three times the stored energy of the entire LHC and its detectors in an incomparably smaller volume. Comparisons with CERN are however a feature throughout the book, and a point of pride for high-energy physicists — clearly, CERN has set the standard for high-tech international collaboration, and ITER has tried to follow its example (CERN Courier October 2014 p45). Having begun my career as a plasma scientist, before turning to accelerators at the beginning of the 1980s, I know some of the stories and personalities involved, including CERN’s former Director General, and recognised father of ITER, Robert Aymar, and ITER’s head of superconductor procurement, my close friend Arnaud Devred, also now of CERN.

I recommend Michel Claessens’ well written and easy-to-read book. It is passionate and informative and explores different points of view without fanaticism. Interestingly, his conclusion is not scientific or political, but socio-philosophical in nature: ITER will be built because it can be, he says, according to a principle of “technological necessity”.

The post A unique exercise in scientific diplomacy appeared first on CERN Courier.

]]>
Review Michel Claessens’ new book ITER: The Giant Fusion Reactor unfolds 40 years of the history of nuclear fusion, says Lucio Rossi. https://cerncourier.com/wp-content/uploads/2020/03/lid_removal_day-2_6b-1000.jpg
Protons herald new cardiac treatment https://cerncourier.com/a/protons-herald-new-cardiac-treatment/ Sat, 21 Mar 2020 11:06:26 +0000 https://preview-courier.web.cern.ch/?p=86672 In a clinical world-first, a proton beam has been used to treat a patient with a ventricular tachycardia, which causes unsynchronised electrical impulses that prevent the heart from pumping blood.

The post Protons herald new cardiac treatment appeared first on CERN Courier.

]]>
The 80 m-circumference synchrotron at CNAO

In a clinical world-first, a proton beam has been used to treat a patient with a ventricular tachycardia, which causes unsynchronised electrical impulses that prevent the heart from pumping blood. On 13 December, a 150 MeV beam of protons was directed at a portion of tissue in the heart of a 73-year-old male patient at the National Center of Oncological Hadrontherapy (CNAO) in Italy – a facility first proposed 25 years ago by the TERA Foundation and rooted in accelerator technologies developed in conjunction with CERN via the Proton Ion Medical Machine Study (PIMMS). The successful procedure had a minimal impact on the delicate surrounding tissues, and marks a new path in the rapidly evolving field of hadron therapy.

The use of proton beams in radiation oncology, first proposed in 1946 by founding director of Fermilab Robert Wilson, allows a large dose to be deposited in a small and well-targeted volume, reducing damage to healthy tissue surrounding a tumour and thereby reducing side effects. Upwards of 170,000 cancer patients have benefitted from proton therapy at almost 100 centres worldwide, and demand continues to grow (CERN Courier January/February 2018 p32).

The choice by clinicians in Italy to use protons to treat a cardiac pathology was born out of necessity to fight an aggressive form of ventricular tachycardia that had not responded effectively to traditional treatments. The idea is that the Bragg peak typical of light charged ions (by which a beam can deposit a large amount of energy in a small region) can produce small scars in the heart tissues similar to the ones caused by the standard invasive technique of RF cardiac ablation. “To date, the use of heavy particles (protons, carbon ions) in this area has been documented in the international scientific literature only on animal models,” said Roberto Rordorf, head of arrhythmology at San Matteo Hospital, in a press release on 22 January. “The Pavia procedure appears to be the first in the world to be performed on humans and the first results are truly encouraging. For this reason, together with CNAO we are evaluating the feasibility of an experimental clinical study.”
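
The depth at which a proton beam deposits its dose is set by the beam energy, and a rough feel for the numbers comes from the empirical Bragg–Kleeman rule, R ≈ αEᵖ. The sketch below uses coefficient values commonly quoted for protons in water; they are approximate and purely illustrative.

```python
# Approximate proton range in water from the empirical Bragg-Kleeman rule,
# R = alpha * E**p. The coefficients below (alpha ~ 0.0022 cm, p ~ 1.77)
# are commonly quoted for protons in water; treat them as approximate.
ALPHA, P = 0.0022, 1.77

def proton_range_cm(energy_mev: float) -> float:
    """Continuous-slowing-down range of a proton in water, in cm."""
    return ALPHA * energy_mev ** P

for e in (70, 150, 250):      # 150 MeV matches the CNAO cardiac treatment
    print(f"{e:3d} MeV -> ~{proton_range_cm(e):4.1f} cm of water")
# A 150 MeV beam stops after roughly 16 cm of tissue - deep enough to reach
# the heart while sparing what lies beyond the Bragg peak.
```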

Hadron therapy for all

CNAO is one of just six next-generation particle-therapy centres in the world capable of generating beams of protons and carbon ions, which are biologically more effective than protons in the treatment of radioresistant tumours. The PIMMS programme from which the accelerator design emerged, carried out at CERN from 1996 to 2000, aimed to design a synchrotron optimised for ion therapy (CERN Courier January/February 2018 p25). The first dual-ion treatment centre in Europe was the Heidelberg Ion-Beam Therapy Centre (HIT) in Germany, designed by GSI, which treated its first patient in 2009. CNAO followed in 2011 and then the Marburg Ion-Beam Therapy Centre in Germany (built by Siemens and operated by Heidelberg University Hospital since 2015). Finally, MedAustron in Austria, based on the PIMMS design, has been operational since 2016. Last year, CERN launched the Next Ion Medical Machine Study (NIMMS) as a continuation of PIMMS to carry out R&D into the superconducting magnets, linacs and gantries for advanced hadron therapy. NIMMS will also explore ways to reduce the cost and footprint of hadron therapy centres, allowing more people in different regions to benefit from the treatment (CERN Courier March 2017 p31).

I think that in 20 years’ time cardiac arrhythmias will be mostly treated with light-ion accelerators

“When I decided to leave the spokesmanship of the DELPHI collaboration to devote my time to cancer therapy with light-ion beams I could not imagine that, 30 years later, I would have witnessed the treatment of a ventricular tachycardia with a proton beam and, moreover, that this event would have taken place at CNAO, a facility that has its roots at CERN,” says TERA founder Ugo Amaldi. “The proton treatment recently announced, proposed to CNAO by cardiologists of the close-by San Matteo Hospital to save the life of a seriously ill patient, is a turning point. Since light-ion ablation is non-invasive and less expensive than the standard catheter ablation, I think that in 20 years’ time cardiac arrhythmias will be mostly treated with light-ion accelerators. For this reason, TERA has secured a patent on the use of ion linacs for heart treatments.”

The post Protons herald new cardiac treatment appeared first on CERN Courier.

]]>
News In a clinical world-first, a proton beam has been used to treat a patient with a ventricular tachycardia, which causes unsynchronised electrical impulses that prevent the heart from pumping blood. https://cerncourier.com/wp-content/uploads/2020/03/CCMarApr20_NewsAnalysis_CNAO2.jpg
Linacs pushed to the limit in Chamonix https://cerncourier.com/a/linacs-pushed-to-the-limit-in-chamonix/ Fri, 24 Jan 2020 13:30:48 +0000 https://preview-courier.web.cern.ch/?p=86358 Linac applications discussed at High Gradient 2019 ranged from CLIC to XFELs and medical accelerators.

The post Linacs pushed to the limit in Chamonix appeared first on CERN Courier.

]]>
This past June in Chamonix, CERN hosted the 12th edition of an international workshop dedicated to the development and application of high-gradient and high-frequency linac technology. These technologies are making accelerators more compact, less expensive and more efficient, and broadening their range of applications. The workshop brought together over 70 scientists and engineers involved in a wide range of accelerator applications, with a common interest in the use and development of normal-conducting radio-frequency cavities with very high accelerating gradients, ranging from around 50 MV/m to above 100 MV/m.

Applications for high-performance linacs such as these include the Compact Linear Collider (CLIC), compact XFELs and inverse-Compton-scattering photon sources, medical accelerators, and specialised devices such as radio-frequency quadrupoles, transverse deflectors and energy-spread linearisers. In recent years the latter two devices have become essential to achieving low emittances and short bunch lengths in high-performance electron linacs of many types, including superconducting linacs. In the coming years, developments from the high-gradient community will be increasing the energy of beams in existing facilities through retrofit programmes, for example in an energy upgrade of the FERMI free-electron laser. In the medium term, a number of new high-gradient linacs are being proposed, such as the room-scale X-ray source SMART*LIGHT, the linac for the advanced-accelerator-concepts research facility EUPRAXIA, and a linac to inject electrons into CERN’s Super Proton Synchrotron for a dark-matter search. The workshop also covered fundamental studies of the very complex physical effects that limit the achievable gradients, such as vacuum arcing, which is one of the main limitations for future technological advances.
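
The appeal of high gradients is easy to quantify: to first order, the active length of a linac is the desired energy gain divided by the accelerating gradient. A toy comparison, ignoring fill factors and structure overheads:

```python
# First-order sizing of a normal-conducting linac: energy gain ~ gradient
# times active length. Focusing, diagnostics and fill factors are ignored,
# so these figures are illustrative lower bounds only.
def active_length_m(energy_gain_mev: float, gradient_mv_per_m: float) -> float:
    return energy_gain_mev / gradient_mv_per_m

for g in (50, 100):
    print(f"1 GeV at {g:3d} MV/m needs ~{active_length_m(1000, g):.0f} m of structure")
```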

Vacuum arcing is one of the main limitations for future technological advances

Originated by the CLIC study, the focus of the workshop series has grown to encompass high-gradient radio-frequency design, precision manufacture, assembly, power sources, high-power operation and prototype testing. It is also notable for its strong industrial participation, and plays an important role in broadening the applications of linac technology by highlighting upcoming hardware to companies. The next workshop in the series will be hosted jointly by SLAC and Los Alamos and will take place on the shore of Lake Tahoe from 8 to 12 June.

The post Linacs pushed to the limit in Chamonix appeared first on CERN Courier.

]]>
Meeting report Linac applications discussed at High Gradient 2019 ranged from CLIC to XFELs and medical accelerators. https://cerncourier.com/wp-content/uploads/2020/01/hg5.jpg
Kirkby bags aerosol award https://cerncourier.com/a/kirkby-bags-aerosol-award/ Fri, 10 Jan 2020 14:39:42 +0000 https://preview-courier.web.cern.ch/?p=86079 Kirkby originated the CLOUD experiment at CERN.

The post Kirkby bags aerosol award appeared first on CERN Courier.

]]>
Jasper Kirkby

The 2019 Benjamin Y H Liu Award of the American Association for Aerosol Research, which recognises outstanding contributions to aerosol instrumentation and experimental techniques, has been awarded to CERN’s Jasper Kirkby for his investigations into atmospheric new-particle and cloud formation using the unique CLOUD experiment at CERN, which he originated. The award committee described CLOUD as “arguably the most effective experiment to study atmospheric nucleation and growth ever designed and constructed, really by a country mile”, and said of Kirkby: “His irrepressible will and determination have adapted the culture of ‘big science’ at CERN to a major atmospheric science problem. Along the way, Jasper has also become a world-class aerosol scientist.”

The post Kirkby bags aerosol award appeared first on CERN Courier.

]]>
News Kirkby originated the CLOUD experiment at CERN. https://cerncourier.com/wp-content/uploads/2020/01/CCJanFeb20_AandA_Kirkby-feature.jpg
Building Gargantua https://cerncourier.com/a/building-gargantua/ Tue, 12 Nov 2019 16:55:07 +0000 https://preview-courier.web.cern.ch/?p=85333 Oliver James describes the visual effects which produced the black hole in Interstellar.

The post Building Gargantua appeared first on CERN Courier.

]]>
Oliver James is chief scientist of the world’s biggest visual effects studio, DNEG, which produced the spectacular visual effects for Interstellar. DNEG’s work, carried out in collaboration with theoretical cosmologist Kip Thorne, led to some of the most physically accurate images of a spinning black hole ever created, earning the firm an Academy Award and a BAFTA. For James, it all began with an undergraduate degree in physics at the University of Oxford in the late 1980s – a period that he describes as one of the most fascinating and intellectually stimulating of his life. “It confronted me with the gap between what you observe and reality. I feel it was the same kind of gap I faced while working on Interstellar. I had to study a lot to understand the physics of black holes and curved space–time.”

A great part of visual effects is understanding how light interacts with surfaces and volumes and eventually enters a camera’s lens. As a student, Oliver was interested in atomic physics, quantum mechanics and modern optics. This, in addition to his two other passions – computing and photography – led him to his first job in a small photographic studio in London, where he became familiar with the technical and operational aspects of the industry. Missing the intellectual challenge offered by physics, in 1995 he secured a role in the R&D team of the Computer Film Company – a niche studio specialising in digital film that was part of the emerging London visual-effects industry.

Suddenly these rag-dolls came to life and you’d find yourself wincing in sympathy as they were battered about

Oliver James

A defining moment came in 2001, when one of his ex-colleagues invited him to join Warner Bros’ ESC Entertainment at Alameda California to work on The Matrix Reloaded & Revolutions. His main task was to work on rigid-body simulations – not a trivial task given the many fight scenes. “There’s a big fight scene, called the Burly Brawl, where hundreds of digital actors get thrown around like skittles,” he says. “We wanted to add realism by simulating the physics of these colliding bodies. The initial tests looked physical, but lifeless, so we enhanced the simulation by introducing torque at every joint, calculated from examples of real locomotion. Suddenly these rag-dolls came to life and you’d find yourself wincing in sympathy as they were battered about”. The sequences took dozens of artists and technicians months of work to create just a few seconds of the movie.

DNEG chief scientist Oliver James

Following his work in ESC Entertainment, James moved back to London and, after a short period at the Moving Picture Company, he finally joined “Double Negative” in 2004 (renamed DNEG in 2018). He’d been attracted by Christopher Nolan’s film Batman Begins, for which the firm was creating visual effects, and it was the beginning of a long and creative journey that would culminate in the sci-fi epic Interstellar, which tells the story of an astronaut searching for habitable planets in outer space.

Physics brings the invisible to life
“We had to create new imagery for black holes; a big challenge even for someone with a physics background,” recalls James. Given that he hadn’t studied general relativity as an undergraduate and had only touched upon special relativity, he decided to call Kip Thorne of Caltech for help. “At one point I asked [Kip] a very concrete question: ‘Could you give me an equation that describes the trajectory of light from a distant star, around the black hole and finally into an observer’s eye?’ This must have struck the right note, as the next day I received an email – it was more like a scientific paper – that included the equations answering my questions.” In total, James and Thorne exchanged some 1000 emails, often including detailed mathematical formalism that DNEG could then use in its code. “I often phrased my questions in a rather clumsy way and Kip insisted: ‘What precisely do you mean?’” says James. “This forced me to rethink what was lying at the heart of my questions.”

The result for the wormhole was like a crystal ball reflecting each point of the universe

Oliver James

DNEG was soon able to develop new rendering software to visualise black holes and wormholes. The director had wanted a wormhole with an adjustable shape and size, so the team designed one with three free parameters: the length and radius of the wormhole’s interior, and a third describing the smoothness of the transition from its interior to its exterior, explains James. “The result for the wormhole was like a crystal ball reflecting each point of the universe; imagine a spherical hole in space–time.” Simulating a black hole represented a bigger challenge as, by definition, it is an object that doesn’t allow light to escape. With his colleagues, he developed a completely new renderer that simulates the path of light through gravitationally warped space–time – including gravitational lensing effects and other physical phenomena that take place around a black hole.
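
The production renderer handled ray bundles in the Kerr geometry of a spinning black hole; as a much-reduced illustration of the underlying idea, the sketch below traces single light rays in the simpler, non-spinning Schwarzschild metric, where (with u = 1/r and G = c = 1) a null geodesic obeys the Binet equation d²u/dφ² = 3Mu² − u. It is a toy under these stated simplifications, not DNEG’s code.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Trace single light rays around a non-spinning (Schwarzschild) black hole
# in geometric units G = c = 1. With u = 1/r, a null geodesic satisfies
#   d2u/dphi2 = 3*M*u**2 - u,
# and the impact parameter b fixes the initial slope through the relation
#   (du/dphi)**2 + u**2 - 2*M*u**3 = 1/b**2.
M = 1.0
U_HORIZON = 1.0 / (2.0 * M)               # u at the event horizon r = 2M

def binet(phi, y):
    u, du = y
    return [du, 3.0 * M * u**2 - u]

def captured(phi, y):                     # ray crosses the horizon
    return y[0] - U_HORIZON
def escaped(phi, y):                      # ray heads back out to infinity
    return y[0]
captured.terminal = escaped.terminal = True
escaped.direction = -1.0                  # trigger only when u falls through zero

for b in (4.0, 5.5, 8.0):                 # capture threshold is b = 3*sqrt(3)*M ~ 5.196M
    u0 = 1.0 / 50.0                       # launch the ray at r = 50M
    du0 = np.sqrt(1.0 / b**2 - u0**2 + 2.0 * M * u0**3)
    sol = solve_ivp(binet, (0.0, 6.0 * np.pi), [u0, du0],
                    events=(captured, escaped), max_step=0.05)
    if sol.t_events[0].size:              # the "captured" event fired
        print(f"b = {b:.1f}M: captured by the black hole")
    else:
        print(f"b = {b:.1f}M: deflected, closest approach {1.0 / sol.y[0].max():.2f}M")
```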

Quality standards
On the internet, one can find many images of black holes “eating” other stars or of stars colliding to form a black hole. But producing an image for a motion picture requires totally different quality standards. The high quality demanded of an IMAX image meant that the team had to eliminate any artefacts that could show up in the final picture, and consequently rendering times were up to 100 hours, compared to the typical 5–6 hours needed for other films. Whereas the primary goal of most astrophysical visualisations is to achieve a fast throughput, their major goal was to create images that looked like they might really have been filmed. “This goal led us to employ a different set of visualisation techniques from those of the astrophysics community—techniques based on propagation of ray bundles (light beams) instead of discrete light rays, and on carefully designed spatial filtering to smooth the overlaps of neighbouring beams,” says James.

Gravitationally-lensed accretion disks

DNEG’s team generated a flat, multicoloured ring representing the accretion disk and positioned it around the spinning black hole. The result was a warped space–time around the black hole, including its accretion disk. Thorne later wrote in his 2014 book The Science of Interstellar: “You cannot imagine how ecstatic I was when Oliver sent me his initial film clips. For the first time ever – and before any other scientist – I saw in ultra-high definition what a fast-spinning black hole looks like. What it does, visually, to its environment.” The following year, James and his DNEG colleagues published two papers with Thorne on the science and visualisation of these objects (Am. J. Phys. 83 486 and Class. Quantum Grav. 32 065001).

Another challenge was to capture the fact that the film camera should be travelling at a substantial fraction of the speed of light. Relativistic aberration, Doppler shifts and gravitational redshifts had to be integrated into the rendering code, influencing how the disk layers would look close to the camera as well as the colour grading and brightness changes in the final image. Things get even more complicated closer to the black hole, where space–time is more distorted: gravitational lensing gets more extreme and the computation takes more steps. Thorne developed procedures describing how to map a light ray and a ray bundle from the light source to the camera’s local sky, and produced low-quality images in Mathematica to verify his code before giving it to DNEG to create the fast and high-resolution render. This was used to simulate all the images to be lensed: fields of stars, dust clouds, nebulae and the accretion disk around Gargantua, Interstellar’s gigantic black hole. In total, the movie notched up almost 800 TB of data. To simulate the starry background, DNEG used the Tycho-2 star catalogue from the European Space Agency, containing about 2.5 million stars; more recently the team has adopted the Gaia catalogue, containing 1.7 billion stars.
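
The special-relativistic ingredients are compact enough to sketch. For a fast camera, each incoming ray is shifted in apparent direction (aberration) and rescaled in frequency and brightness (Doppler shift and beaming). The following uses only textbook formulas with illustrative numbers; it is not DNEG’s pipeline, and gravitational redshift would be folded in separately.

```python
import numpy as np

# Aberration, Doppler shift and beaming for a camera moving at speed beta
# (in units of c) through a star field. theta is the star's direction,
# measured from the velocity vector in the stars' rest frame; theta_obs is
# the direction in which the moving camera sees it.
beta = 0.5
gamma = 1.0 / np.sqrt(1.0 - beta**2)

for theta_deg in (0.0, 60.0, 120.0, 180.0):
    c = np.cos(np.radians(theta_deg))
    doppler = gamma * (1.0 + beta * c)        # nu_obs / nu_emit
    theta_obs = np.degrees(np.arccos((c + beta) / (1.0 + beta * c)))
    beaming = doppler**4                      # frequency-integrated intensity boost
    print(f"star at {theta_deg:5.1f} deg -> seen at {theta_obs:5.1f} deg, "
          f"freq x{doppler:.2f}, brightness x{beaming:.2f}")
```

At half the speed of light the sky crowds toward the direction of motion, stars ahead turn brighter and bluer, and those behind dim and redden – exactly the effects that had to be baked into the colour grading and brightness of the final image.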

Creative industry
With the increased use of visual effects, more and more scientists – including mathematicians and physicists – are working in the field. Visual effects are vital not only for sci-fi movies but are also integrated into drama and historical films. Furthermore, there are a growing number of companies creating tailored simulation packages for specific processes. DNEG alone has grown from 80 people in 2004 to more than 5000 people today. At the same time, this increase in numbers means that software needs to be scalable and adaptable enough to serve a wide range of skilled artists, James explains. “Developing specialised simulation software that gets used locally by a small group of skilled artists is one thing but making it usable by a wide range of artists across the globe calls for a much bigger effort – to make it robust and much more accessible”.

DNEG CERN Colloquium

Asked if computational resources are a limiting factor for the future of visual effects, James thinks any increase in computational power will quickly be swallowed up by artists adding extra detail or creating more complex simulations. The game-changer, he says, will be real-time simulation and rendering. Today, video games are rendered in real time by the computer’s video card, whereas visual effects in movies are almost entirely created as batch processes, with the results cached or pre-rendered so they can be played back in real time. “Moving to real-time rendering means that the workflow will not rely on overnight renders and would allow artists many more iterations during production. We have only scratched the surface and there are plenty of opportunities for scientists”. Machine learning also promises to play a role in the industry, and James is currently involved in R&D to use it to enable more natural body movements or facial expressions. Open data and open access are also growing areas, in which DNEG is actively involved.

“Visual effects is a fascinating industry where technology and hard-science are used to solve creative problems,” says James. “Occasionally the roles get reversed and our creativity can have a real impact on science.”

The post Building Gargantua appeared first on CERN Courier.

]]>
Feature Oliver James describes the visual effects which produced the black hole in Interstellar. https://cerncourier.com/wp-content/uploads/2019/11/Interstellar.jpg
Hadron therapy to get heavier in Southeast Europe https://cerncourier.com/a/hadron-therapy-to-get-heavier-in-southeast-europe/ Fri, 20 Sep 2019 13:49:45 +0000 https://preview-courier.web.cern.ch/?p=84629 Design phase begins for SEEIIST, a centre for internationally competitive research founded in the spirit of the CERN model.

The post Hadron therapy to get heavier in Southeast Europe appeared first on CERN Courier.

]]>
Montenegro prime minister Duško Marković marks the start of the SEEIIST design phase on 18 September.

A state-of-the-art facility for hadron therapy in Southeast Europe has moved from its conceptual to design phase, following financial support from the European Commission. At a kick-off meeting held on Wednesday 18 September in Budva, Montenegro, more than 120 people met to discuss the future South East European International Institute for Sustainable Technologies (SEEIIST) – a facility for tumour therapy and biomedical research that follows the founding principles of CERN.

“This is a region that has no dilemma regarding its European affiliation, and which, I believe, will be part of a joint European competition for technological progress. Therefore, the International Institute for Sustainable Technologies is an urgent need of our region,” said Montenegro prime minister Duško Marković during the opening address. “I am confident that the political support for this project is obvious and indisputable. The memorandum of understanding was signed by six prime ministers in July this year in Poznan. I believe that other countries in the region will formally join the initiative.”

The idea for SEEIIST germinated three years ago at a meeting of trustees of the World Academy of Art and Science in Dubrovnik, Croatia. It is the brainchild of former CERN Director-General Herwig Schopper, and has benefitted from a political push from Montenegro minister of science Sanja Damjanović, herself a physicist who has worked at CERN and at GSI-FAIR in Darmstadt, Germany. SEEIIST aims to create a platform for internationally competitive research in the spirit of the CERN model “science for peace”, stimulating the education of young scientists, building scientific capacity and fostering greater cooperation and mobility in the region.

SEEIIST event

In January 2018, at a forum at the International Centre for Theoretical Physics in Italy held under the auspices of UNESCO, the International Atomic Energy Agency and the European Physical Society, two possibilities for a large international institute were presented: a synchrotron X-ray facility and a hadron-therapy centre. Soon afterwards, the 10 participating parties of SEEIIST’s newly formed intergovernmental steering committee chose the latter.

Europe has played a major role in the development of hadron therapy, with numerous centres currently offering proton therapy and four facilities offering proton and more advanced carbon-ion treatment. But currently no such facility exists in Southeast Europe despite a growing number of tumours being diagnosed there. SEEIIST will follow the idea of the “PIMMS” accelerator design started at CERN two decades ago, profiting from the experience at the dual proton–ion centres CNAO in Italy and MedAustron in Austria, and also centres at GSI and in Heidelberg. It will be a unique facility that splits its beam time 50:50 between treating patients and performing research with a wide range of different ions for radiobiology, imaging and treatment planning. The latter will include studies into the feasibility of heavier ions such as oxygen, making SEEIIST distinct in this rapidly growing field.

The next steps are to prepare a definite technical design for the facility, to propose a structure and business plan and to define the conditions for the site selection. To carry out these tasks, several working groups are being established in close collaboration with CERN and GSI-FAIR. “This great event was a culmination of the continuous efforts invested since 2017 into the project,” says Damjanović. “If all goes well, construction is expected to start in 2023, with first patient treatment in 2028.”

The post Hadron therapy to get heavier in Southeast Europe appeared first on CERN Courier.

]]>
News Design phase begins for SEEIIST, a centre for internationally competitive research founded in the spirit of the CERN model. https://cerncourier.com/wp-content/uploads/2019/09/SEEIISTopening-1024x683_resize.jpg
CERN and ESA join forces in harsh environments https://cerncourier.com/a/cern-and-esa-join-forces-in-harsh-environments/ Thu, 12 Sep 2019 08:44:26 +0000 https://preview-courier.web.cern.ch/?p=84309 CERN has signed a collaboration agreement with the European Space Agency to address the challenges of harsh radiation environments.

The post CERN and ESA join forces in harsh environments appeared first on CERN Courier.

]]>
The effects of radiation on electronics for the JUICE mission

Strengthening connections between particle physics and related disciplines, CERN signed a collaboration agreement with the European Space Agency (ESA) on 11 July to address the challenges of operating equipment in harsh radiation environments. Such environments are found in both particle-physics facilities and outer space, and the agreement identifies several high-priority projects, including: high-energy electron tests; high-penetration heavy-ion tests; assessment of commercial components and modules; radiation-hard and radiation-tolerant components and modules; radiation detectors, monitors and dosimeters; and simulation tools for radiation effects. Important preliminary results have already been achieved in some areas, including high-energy electron tests of electronics for the Jupiter Icy Moons Explorer (JUICE) mission performed at CERN’s CLEAR/VESPER facility.

The post CERN and ESA join forces in harsh environments appeared first on CERN Courier.

]]>
News CERN has signed a collaboration agreement with the European Space Agency to address the challenges of harsh radiation environments. https://cerncourier.com/wp-content/uploads/2019/09/CCSepOct19_news_europa.jpg
The cutting edge of cancer research https://cerncourier.com/a/the-cutting-edge-of-cancer-research/ Mon, 09 Sep 2019 13:23:13 +0000 https://preview-courier.web.cern.ch/?p=84365 In recent years, physicists have contributed insights into the interplay of phenomena at different scales.

The post The cutting edge of cancer research appeared first on CERN Courier.

]]>
Breast cancer cells

Cancer is a heterogeneous phenomenon that is best viewed as a complex system of cells interacting in a changing micro-environment. Individual experiments may fail to capture this reality, given spatially and temporally limited scales of observation. In recent years, however, physicists have contributed insights into the interplay of phenomena at different scales: gene regulatory networks and communities of cells or organisms are two examples of systems whose properties emerge from the behaviour of individual components. Unfortunately, such research is usually confined to specialised journals and conferences, hindering the entry of interested physicists into the field. The publication of a new interdisciplinary textbook is therefore most welcome.

La Porta and Zapperi’s The Physics of Cancer, one of the few books devoted to this subject, brings 15 years of exciting and important results in cancer research to a wide audience. The book approaches the subject from the perspective of physics, chemistry, mathematics and computer science. As a result of the vastness of the subject and the brevity of the book, the discussion can occasionally feel superficial, but the main concepts are introduced in a manner accessible to physicists. The authors follow a logical thread within each argument, and furnish the reader with abundant references to the original literature.

The book begins by observing that the “hallmarks” of cancer are not only yet to be understood, but have increased in number. Published at the turn of the millennium, Douglas Hanahan and Robert Weinberg’s seminal paper identified six: sustaining proliferative signalling; evading growth suppressors; enabling replicative immortality; activating invasion and metastasis; inducing angiogenesis; and resisting cell death. Just 11 years later the same authors published an updated review adding four more hallmarks: avoiding immune destruction; promoting inflammation; genome instability and mutation; and deregulating cellular energetics. The amount of research that has been distilled into a handful of concepts is formidable. However, La Porta and Zapperi argue that a more abstract and unifying approach is now needed to gain a deeper understanding. They advocate studying cancer as a complex system with the tools of several disciplines, in particular subfields of physics such as biomechanics, soft-condensed-matter physics and statistical mechanics.

The book is structured in 10 self-contained chapters. The first two present essential notions of cell and cancer biology. The subsequent chapters deal with different features of cancer from an interdisciplinary perspective. A discussion on statistics and computational models of cancer growth is followed by a chapter exploring the generation of vascular networks in its biological, hydrodynamical and statistical aspects. Next comes a mathematical discussion of tumour growth by stem cells – the active and self-differentiating cells thought to drive the growth of cancers. A couple of chapters treat the biomechanics of cancer cells and their migration in the body, before La Porta and Zapperi turn to the dynamics of chromosomes and the origin of the genetic mutations that cause cancer. The final two chapters focus on how to fight tumours, from the perspectives of both the immune system and pharmacological agents.
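
To give a flavour of the quantitative material in the growth chapters, one classic description is Gompertzian growth, dN/dt = aN ln(K/N), in which expansion slows as the tumour approaches a carrying capacity. A minimal sketch with purely illustrative parameters (not values taken from the book):

```python
import math

# Minimal Gompertz tumour-growth model: dN/dt = a * N * ln(K / N), where
# N is the cell number, K the carrying capacity and a a growth constant.
# All parameter values below are illustrative only.
a, K = 0.05, 1e12        # a in 1/day; ~1e12 cells is a commonly cited lethal burden
N, dt = 1e6, 1.0         # start from ~1e6 cells; 1-day forward-Euler steps

for day in range(401):
    if day % 100 == 0:
        print(f"day {day:3d}: {N:.2e} cells")
    N += a * N * math.log(K / N) * dt    # growth rate fades as N approaches K
```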

La Porta and Zapperi’s book isn’t just light reading for curious physicists – it can also serve to guide interested researchers into a rich interdisciplinary area.

The post The cutting edge of cancer research appeared first on CERN Courier.

]]>
Review In recent years, physicists have contributed insights into the interplay of phenomena at different scales. https://cerncourier.com/wp-content/uploads/2019/09/CCSepOct19_Rev-cancer.jpg
Austrian synchrotron debuts carbon-ion cancer treatment https://cerncourier.com/a/austrian-synchrotron-debuts-carbon-ion-cancer-treatment/ Mon, 05 Aug 2019 15:43:54 +0000 https://preview-courier.web.cern.ch/?p=83969 MedAustron becomes one of six centres worldwide to treat tumours with carbon ions.

The post Austrian synchrotron debuts carbon-ion cancer treatment appeared first on CERN Courier.

]]>
The ion-beam injectors of the MedAustron facility in Austria. Credit: MedAustron/T Kästenbauer

MedAustron, an advanced hadron-therapy centre in Austria, has treated its first patient with carbon ions. The medical milestone, which took place on 2 July 2019, elevates the particle-physics-linked facility to the ranks of only six centres worldwide that can combat tumours with both protons and carbon ions.

When protons and carbon ions strike biological material, they lose energy much more quickly than photons, which are traditionally used in radiotherapy. This makes it possible to deposit a large dose in a small and well-targeted volume, reducing damage to healthy tissue surrounding a tumour and thereby reducing the risk of side effects. While proton therapy has been successfully used at MedAustron since December 2016, treating more than 400 cancer patients so far, carbon-ion therapy opens up new opportunities to target tumours that were previously difficult or impossible to treat. Carbon ions are biologically more effective than protons and therefore allow a higher dose to be administered to the tumour.

MedAustron’s accelerator complex is based on the CERN-led Proton Ion Medical Machine Study, the design subsequently developed by CERN, the TERA Foundation, INFN in Italy and the CNAO Foundation (see “Therapeutic particles”). Substantial help was also provided by the Paul Scherrer Institute, in particular for the gantry and beam-delivery designs. The MedAustron system comprises an injector, where ions from three ion sources are pre-accelerated by a linear accelerator, a synchrotron, a high-energy beam transport system to deliver the beam to various beam ports, and a medical front-end, which controls the irradiation process and covers all safety aspects with respect to the patient. Certified as a medical product, the accelerator provides proton and carbon-ion beams with a penetration depth of up to about 37 cm in water-equivalent tissue, and is able to deliver carbon ions with 255 different energies ranging from 120 to 400 MeV/u, with maximum intensities of up to 10⁹ ions per extracted beam pulse.

The MedAustron proton/carbon-ion synchrotron

“The first successful carbon-ion treatment unveils MedAustron’s full potential for cancer treatment,” says Michael Benedikt of CERN, who co-ordinated the laboratory’s contributions to the project. “The realisation of MedAustron, through the collaboration with CERN for the construction of the accelerator facility, is an excellent example of large-scale technology transfer from fundamental research to societal applications.”

Particle therapy with carbon ions was first used in Japan in 1994, and a total of almost 30,000 patients worldwide have since been treated with this method. Initially, treatment with carbon ions at MedAustron will focus on tumours in the head and neck region, and at the base of the skull. But the spectrum will be continuously expanded to include other tumour types. MedAustron is also working on the completion of an additional treatment room with a gantry that administers proton beams from a large variety of irradiation angles.

“Irradiation with carbon ions makes it possible to maintain both the physical functions and the quality of life of patients, even with very complicated tumours,” says Piero Fossati, scientific and clinical director of MedAustron’s carbon ion programme.

The post Austrian synchrotron debuts carbon-ion cancer treatment appeared first on CERN Courier.

]]>
News MedAustron becomes one of six centres worldwide to treat tumours with carbon ions. https://cerncourier.com/wp-content/uploads/2019/08/MedAustron-ion-sourcesHIRES.jpg
FuSuMaTech initiative levels up https://cerncourier.com/a/fusumatech-initiative-levels-up/ Thu, 11 Jul 2019 08:28:23 +0000 https://preview-courier.web.cern.ch?p=83615 Projects include new designs for MRI gradient coils, the design of 14 and 16 T MRI magnets, and a conceptual design for new mammography magnets.

The post FuSuMaTech initiative levels up appeared first on CERN Courier.

]]>

On 1 April more than 90 delegates gathered at CERN to discuss perspectives on superconducting magnet technology. The workshop marked the completion of phase 1 of the Future Superconducting Magnet Technology (FuSuMaTech) Initiative, launched in October 2017.

FuSuMaTech is a Horizon 2020 Future Emerging Technologies project co-funded by the European Commission, with the support of industrial partners ASG, Oxford Instruments, TESLA, SIGMAPHI, ELLYT Energy and BILFINGER, and academic partners CERN, CEA, STFC, KIT, PSI and CNRS. It aims to strengthen the field of superconductivity for projects such as the High-Luminosity LHC and the Future Circular Collider, while demonstrating the benefits of this investment to society at large.

“The need to develop higher performing magnets for future accelerators is certain, and cooperation will be essential,” said Han Dols of CERN’s knowledge transfer group. “The workshop helps reiterate common areas of interest between academia and industry, and how they might benefit from each other’s know-how. And just as importantly,” continued Dols, “FuSuMaTech is seeking to demonstrate the benefits of this investment by setting up demonstrator projects.”

The successful preparation of 10 project proposals for both R&D actions and industrial applications is one of the main achievements of FuSuMaTech Phase-1, noted project coordinator Antoine Dael. These projects include new designs for MRI gradient coils, the design of 14 and 16 T MRI magnets, and a conceptual design for new mammography magnets. New developments are also included in the proposals, with the design for a hybrid low–high temperature superconductor magnet, an e-infrastructure to collect material properties and a pulsed-heat-pipe cooling system.

In phase 2 of FuSuMaTech, launched with the signing of a declaration of intention between the FuSuMaTech partners on April 1, the 10 project proposals prepared during phase 1 will evolve into independent projects and make use of other European Union programmes. “We were really impressed with the interest we got from organisations outside of the project,” said Dael. “We currently have six industrial partners, two more have already contacted us today, and we expect others.”

The post FuSuMaTech initiative levels up appeared first on CERN Courier.

]]>
Meeting report Projects include new designs for MRI gradient coils, the design of 14 and 16 T MRI magnets, and a conceptual design for new mammography magnets. https://cerncourier.com/wp-content/uploads/2019/07/CCJulAug19_FN-fusumatech-1.jpg
Deciphering elementary particles https://cerncourier.com/a/deciphering-elementary-particles/ Fri, 22 Mar 2019 14:01:04 +0000 https://preview-courier.web.cern.ch?p=13671 Former CERN physicist Christian Fabjan takes a whirlwind tour of 60 years of innovation in particle-detection technology at CERN and beyond.

The post Deciphering elementary particles appeared first on CERN Courier.

]]>
Particle physics began more than a century ago with the discoveries of radioactivity, the electron and cosmic rays. Photographic plates, gas-filled counters and scintillating substances were the early tools of the trade. Studying cloud formation in moist air led to the invention of the cloud chamber, which, in 1932, enabled the discovery of the positron. The photographic plate soon morphed into nuclear-emulsion stacks, and the Geiger tube of the Geiger–Marsden–Rutherford experiments developed into the workhorse for cosmic-ray studies. The bubble chamber, invented in 1952, represented the culmination of these “imaging detectors”, using film as the recording medium. Meanwhile, in the 1940s, the advent of photomultipliers had opened the way to crystal-based photon and electron energy measurements and Cherenkov detectors. This was the toolbox of the first half of the 20th century, credited with a number of groundbreaking discoveries that earned the toolmakers and their artisans more than 10 Nobel Prizes.

extraction of the ALICE time projection chamber

Game changer

The invention of the multi-wire proportional chamber (MWPC) by Georges Charpak in 1968 was a game changer, earning him the 1992 Nobel Prize in Physics. Suddenly, experimenters had access to large-area charged-particle detectors with millimetre spatial resolution and staggering MHz-rate capability. Crucially, the emerging integrated-circuit technology could deliver amplifiers small and cheap enough to equip many thousands of proportional wires. This ingenious and deceptively simple detector is relatively easy to construct. The workshops of many university physics departments could master the technology, attracting students and “democratising” particle physics. So compelling was experimentation with MWPCs that within a few years, large detector facilities with tens of thousands of wires were constructed – witness the Split Field Magnet at CERN’s Intersecting Storage Rings (ISR). Its rise to prominence was unstoppable: it became the detector of choice for the Proton Synchrotron, Super Proton Synchrotron (SPS) and ISR programmes. An extension of this technique is the drift chamber, an MWPC-type geometry with which the time difference between the passage of the particle and the onset of the wire signal is recorded, providing a measure of position with 100 µm-level resolution. The MWPC concept lends itself to a multitude of geometries and has found its “purest” application as the readout of time projection chambers (TPCs). Modern derivatives replace the wire planes with metallised foils with holes in a sub-millimetre pattern, amplifying the ionisation signals.
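
The drift-chamber principle reduces to one line of arithmetic: the distance of the track from the wire is the electron drift velocity multiplied by the measured drift time. A toy sketch with typical (illustrative) gas parameters shows how nanosecond timing translates into the 100 µm-level resolution quoted above.

```python
# Drift-chamber position measurement in miniature. Ionisation electrons
# drift to the sense wire at roughly constant velocity; ~5 cm/us is a
# typical figure for argon-based gas mixtures (illustrative, not universal).
DRIFT_VELOCITY_CM_PER_US = 5.0            # equivalent to 50 um per ns

def drift_position_cm(t_signal_us: float, t0_us: float) -> float:
    """Track distance from the sense wire, from the measured drift time."""
    return DRIFT_VELOCITY_CM_PER_US * (t_signal_us - t0_us)

print(f"{drift_position_cm(0.46, 0.22):.2f} cm from the wire")   # a 240 ns drift
sigma_t_ns = 2.0                          # assumed timing resolution
print(f"sigma_x ~ {50.0 * sigma_t_ns:.0f} um for {sigma_t_ns:.0f} ns timing")
```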

The ambition, style and success of these large, global collaborations was contagious

The ISR was a hotbed for accelerator and detector inventions. The world’s first proton–proton collider, an audacious project, was clearly ahead of its time and the initial experiments could not fully exploit its discovery potential. It prompted, however, the concept of multi-purpose facilities capable of obtaining “complete” collision information. For the first time, a group developed and used transition-radiation detectors for electron detection and liquid-argon calorimetry. The ISR’s Axial Field Spectrometer (AFS) provided high-quality hadron calorimetry with close to 4π coverage. These technologies are now widely used at accelerators and for non-accelerator experiments. The stringent performance requirements for experiments at the ISR encouraged the detector developers to explore and reach a measurement quality only limited by the laws of detector physics: science-based procedures had replaced the “black magic” of detector construction. With collision rates in the 10 MHz range, these experiments (and the ISR) were forerunners of today’s Large Hadron Collider (LHC) experiments. Of course, the ISR is most famous for its seminal accelerator developments, in particular the invention of stochastic cooling, which was the enabling technology for converting the SPS into a proton–antiproton collider.

The SPS marked another moment of glory for CERN. In 1976 first beams were accelerated to 400 GeV, initiating a diverse physics programme and motivating a host of detector developments. Advances in semiconductor technology led to the silicon-strip detector. With the experiments barely started, Carlo Rubbia and collaborators launched the idea, as ingenious as it was audacious, to convert the SPS into a proton–antiproton collider. The goal was clear: orchestrate quickly and rather cheaply a machine with enough collision energy to produce the putative W and Z bosons. Simon van der Meer’s stochastic-cooling scheme had to deliver the required beam intensity and lifetime, and two experimental teams were charged with the conception and construction of the equally novel detectors. The centrepiece of the UA1 detector was a 6 m-long and 2 m-diameter “electronic bubble chamber”, which adapted the drift-chamber concept to the event topology and collision rate, combined with state-of-the-art electronic readout. The electronic images were of such illuminating quality that “event scanning”, the venerable bubble- chamber technique, was again a key tool in data analysis. The UA2 team pushed calorimetry and silicon detectors to new levels of performance, provided healthy competition and independent discoveries. The discovery of the W and Z bosons was achieved in 1983 and, the following year, Rubbia and van der Meer became Nobel Laureates.

Laying foundations

In 1981, with the approval of the Large Electron Positron (LEP) collider, the community laid the foundation for decades of research at CERN. Mastering the new scale of the accelerator dimension also brought a new approach to managing the larger experimental collaborations and to meeting their more stringent experimental requirements. For the first time, mostly outside collaborators developed and built the experimental apparatus, a non-trivial, but needed success in technology transfer. The detection techniques reached a new state of matureness. Silicon-strip detectors became ubiquitous. Gaseous tracking in a variety of forms, such as TPCs and jet chambers, reached new levels of size and performance. There were also some notable firsts. The DELPHI collaboration developed the Ring Imaging Cherenkov Counter, a delicate technology in which the distribution of Cherenkov photons, imaged with mirrors onto photon-sensitive MWPC-type detectors, provides a measure of the particle’s velocity. The L3 collaboration aimed at ultimate-precision energy measurements of muons, photons and electrons, and put its money on a recently discovered scintillating crystal, bismuth germanate. Particle physicists, material scientists and crystallographers from academia and industry transformed this laboratory curiosity into mass-producible technology: ultimately, 12,000 crystals were grown, cut to size as truncated pyramids and assembled into the calorimeter, a pioneering trendsetter.

the multi-wire proportional chamber

The ambition, style and success of these large, global collaborations was contagious. It gave the cosmic-ray community a new lease of life. The Pierre Auger Observatory, one of whose initiators was particle physicist and Nobel Laureate James Cronin, explores cosmic rays at extreme energies with close to 2000 detector stations spread over an area of 3000 km². The IceCube collaboration has instrumented around a cubic kilometre of Antarctic ice to detect neutrinos. One of the most ambitious experiments is the Alpha Magnetic Spectrometer, hosted by the International Space Station – again with a particle physicist and Nobel Prize winner, Samuel Ting, as a prime mover and shaker.

These decade-long efforts in experimentation find their present culmination at the LHC. Experimenters had to innovate on several fronts: all detector systems were designed for and had to achieve ultimate performance, limited only by the laws of physics; the detectors must operate at collision rates of a GHz or more, generating some 100 billion particles per second. “Impossible” was many an expert’s verdict in the early 1990s. The successful collaboration with industry giants in the IT and electronics sectors was a life-saver; and achieving all this – fraught with difficulties, technical and sociological – in international collaborations of several thousand scientists and engineers was an immense achievement. All existing detection technologies – ranging from silicon tracking, transition-radiation and RICH detectors, and liquid-argon, scintillator and crystal calorimeters to 10,000 m³-scale muon spectrometers – needed novel ideas, major improvements and daring extrapolations. The success of the LHC experiments is beyond the wildest dreams: hundreds of measurements achieve a precision previously considered possible only at electron–positron colliders. The Higgs boson, discovered in 2012, will be part of the research agenda for most of the 21st century, and CERN is in the starting blocks with ambitious plans.

Sharing with society

Worldwide, more than 30,000 accelerators are in operation. Particle and nuclear physics research uses barely more than 100 of them. Society is the principal client, and many of the accelerator innovations and particle detectors have found their way into industry, biology and health applications. A class of accelerators, to which CERN has contributed significantly, is specifically dedicated to tumour therapy. Particle detectors have made a particular impact on medical imaging, such as positron emission tomography (PET), whose origin dates back to CERN with an MWPC-based detector in the 1970s. Today’s clinical PET scanners use crystals very similar to those used in the discovery of the Higgs boson.

Possibly the most important benefit of particle physics to society is the collaborative approach developed by the community, which underpins the incredible success that has led us to the LHC experiments today. There are no signs that the rate of innovation in detectors and instrumentation is slowing. Currently the LHC experiments are undergoing major upgrades, and plans for the next generation of experiments and colliders are already well under way. These collaborations remain united and driven by a common goal, bridging cultural and political divides.

The post Deciphering elementary particles appeared first on CERN Courier.

]]>
Feature Former CERN physicist Christian Fabjan takes a whirlwind tour of 60 years of innovation in particle-detection technology at CERN and beyond. https://cerncourier.com/wp-content/uploads/2019/03/CCSupp_1_Detec_Foreword-1.png
First light for supersymmetry https://cerncourier.com/a/first-light-for-supersymmetry/ Fri, 08 Mar 2019 16:22:44 +0000 https://preview-courier.web.cern.ch?p=13629 Ideas from supersymmetry have been used to address a longstanding challenge in optics – how to suppress unwanted spatial modes that limit the beam quality of high-power lasers.

The post First light for supersymmetry appeared first on CERN Courier.

]]>
schematic representation of a supersymmetric laser array

Ideas from supersymmetry have been used to address a longstanding challenge in optics – how to suppress unwanted spatial modes that limit the beam quality of high-power lasers. Mercedeh Khajavikhan at the University of Central Florida in the US and colleagues have created the first supersymmetric laser array, paving the way towards new schemes for scaling up the radiance of integrated semiconductor lasers.

Supersymmetry (SUSY) is a possible additional symmetry of space–time that would enable bosonic and fermionic degrees of freedom to be “rotated” between one another. Devised in the 1970s in the context of particle physics, it suggests the existence of a mirror-world of supersymmetric particles and promises a unified description of all fundamental interactions. “Even though the full ramification of SUSY in high-energy physics is still a matter of debate that awaits experimental validation, supersymmetric techniques have already found their way into low-energy physics, condensed matter, statistical mechanics, nonlinear dynamics and soliton theory as well as in stochastic processes and BCS-type theories, to mention a few,” write Khajavikhan and collaborators in Science.

The team applied the SUSY formalism first proposed by Ed Witten of the Institute for Advanced Study in Princeton to force a semiconductor laser array to operate exclusively in its fundamental transverse mode. In contrast to previous schemes developed to achieve this, such as common antenna-feedback methods, SUSY introduces a global and systematic method that applies to any type of integrated laser array, explains Khajavikhan. “Now that the proof of concept has been demonstrated, we are poised to develop high-power electrically pumped laser arrays based on a SUSY design. This can be applicable to various wavelengths, ranging from visible to mid-infrared lasers.”

To demonstrate the concept, the Florida-based team paired the unwanted modes of the main laser array (comprising five coupled ridge-waveguide cavities etched from quantum wells on an InP wafer) with a lossy superpartner (an array of four waveguides left unpumped). Optical strategies were used to build a superpartner index profile with propagation constants matching those of the four higher-order modes associated with the main array, and the performance of the SUSY laser was assessed using a custom-made optical setup. The results indicated that the existence of an unbroken SUSY phase (in conjunction with a judicious pumping of the laser array) can promote the in-phase fundamental mode and produce high-radiance emission.
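
The essence of this superpartner construction can be sketched numerically in the coupled-mode (tight-binding) picture. The Python snippet below is a minimal illustration and not the authors’ design code: the array size, propagation constant and coupling strength are invented values, and the discrete SUSY step is performed here with a Cholesky factorisation.

import numpy as np

# Coupled-mode Hamiltonian of a five-waveguide main array (illustrative
# values): equal propagation constants beta0 on the diagonal and
# nearest-neighbour coupling kappa on the off-diagonals.
n, beta0, kappa = 5, 1.0, 0.1
H1 = beta0 * np.eye(n) + kappa * (np.eye(n, k=1) + np.eye(n, k=-1))

betas = np.linalg.eigvalsh(H1)   # propagation constants, ascending
beta_fund = betas[-1]            # fundamental (in-phase) mode

# Discrete SUSY step: factorise the positive-definite matrix
# (beta_fund + eps) I - H1 = L L^T and swap the factors; the tiny eps
# keeps the near-singular Cholesky factorisation numerically stable.
eps = 1e-9
M = (beta_fund + eps) * np.eye(n) - H1
L = np.linalg.cholesky(M)
H2 = (beta_fund + eps) * np.eye(n) - L.T @ L

# In H2 the last site decouples, carrying the fundamental mode; the
# remaining block is the superpartner array, whose propagation constants
# match the four higher-order modes of the main array.
print(np.round(betas[:-1], 6))
print(np.round(np.linalg.eigvalsh(H2[:-1, :-1]), 6))

Matching the four higher-order propagation constants while leaving the fundamental one unpaired is exactly what lets a lossy superpartner drain the unwanted modes.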

“This is a remarkable example of how a fundamental idea such as SUSY may have a practical application, here increasing the power of lasers,” says SUSY pioneer John Ellis of King’s College London. “The discovery of fundamental SUSY still eludes us, but SUSY engineering has now arrived.”

The post First light for supersymmetry appeared first on CERN Courier.

]]>
News Ideas from supersymmetry have been used to address a longstanding challenge in optics – how to suppress unwanted spatial modes that limit the beam quality of high-power lasers. https://cerncourier.com/wp-content/uploads/2019/03/CCMarApr19_News-susy.jpg
Cosmic research poles apart https://cerncourier.com/a/cosmic-research-poles-apart/ Fri, 30 Nov 2018 09:00:03 +0000 https://preview-courier.web.cern.ch/?p=12965 Two independent groups are going to Earth’s extremes to make unprecedented measurements for physics, education and the environment.

The post Cosmic research poles apart appeared first on CERN Courier.

]]>

Every second, each square metre of the Earth is struck by thousands of charged particles travelling from deep space. It is now more than a century since cosmic rays were discovered, yet still they present major challenges to physics. The origin of high-energy cosmic rays is the biggest mystery, their energy too high to have been generated by astrophysical sources such as supernovae, pulsars or even black holes. But cosmic rays are also of interest beyond astrophysics. Recent studies at CERN’s CLOUD experiment, for example, suggest that cosmic rays may influence cloud cover through the formation of new aerosols, with important implications for the evolution of Earth’s climate.

This year, two independent missions were mounted in the Arctic and in Antarctica – Polarquest2018 and Clean2Antarctica – to understand more about the physics of high-energy cosmic rays. Both projects have a strong educational and environmental dimension, and are among the first to measure cosmic rays at such high latitudes.

Geomagnetic focus

Due to the shape of the geomagnetic field, the intensity of the charged cosmic radiation is higher at the poles than it is in equatorial regions. At the end of the 1920s it was commonly believed that cosmic rays were high-energy neutral particles (i.e. gamma rays), implying that the Earth’s magnetic field would not affect cosmic-ray intensity. However, early observations of the dependence of the cosmic-ray intensity on latitude rejected this hypothesis, showing that cosmic rays mainly consist of charged particles and leading to the first quantitative calculations of their composition.

The interest in measuring the cosmic-ray flux close to the poles is related to the fact that the geomagnetic field shields the Earth from low-energy charged cosmic rays, with an energy threshold (geomagnetic cut-off) depending on latitude, explains Mario Nicola Mazziotta, an INFN researcher and member of the Polarquest2018 team. “Although the geomagnetic cut-off decreases with increasing latitude, the cosmic-ray intensity at Earth reaches its maximum at latitudes of about 50–60°, where the cut-off is of a few GeV or less, and then seems not to grow anymore with latitude. This indicates that cosmic-ray intensity below a given energy is suppressed, due to solar effects, and makes the study of cosmic rays near the polar regions a very useful probe of solar activity.”
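
As a rough guide to the numbers involved, the vertical cut-off rigidity in a pure dipole-field (Störmer) approximation falls as the fourth power of the cosine of the magnetic latitude. The short sketch below uses the commonly quoted surface value of about 14.9 GV at the magnetic equator; real cut-offs are computed by tracing particles through detailed field models, so this is illustration only.

import numpy as np

# Vertical geomagnetic cut-off rigidity in the dipole (Stormer)
# approximation: R_c ~ 14.9 cos^4(magnetic latitude) GV at the surface.
def vertical_cutoff_gv(mag_latitude_deg):
    lam = np.radians(mag_latitude_deg)
    return 14.9 * np.cos(lam) ** 4

for lat in (0, 30, 50, 60, 82):
    print(f"{lat:2d} deg -> R_c ~ {vertical_cutoff_gv(lat):5.2f} GV")
# Above ~50-60 deg the cut-off drops to a few GV or below, which is why
# the measured intensity plateaus there: solar modulation suppresses the
# lowest-energy particles that the field would otherwise admit.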

Polarquest2018 is a small cosmic-ray experiment that recently completed a six-week-long expedition to the Arctic Circle, on board an 18 m-long boat called Nanuq designed for sailing in extreme regions. The boat set out from Isafjordur, in north-west Iceland, on 22 July, circumnavigating the Svalbard archipelago in August and arriving in Tromsø on 4 September. The Polarquest2018 detectors reached 82 degrees north, shedding light on the soft component of cosmic rays trapped at the poles by Earth’s magnetic field.

Polarquest2018 is the result of the hard work of a team of a dozen people for more than a year, in addition to enthusiastic support from many other collaborators. Built at CERN by school students from Switzerland, Italy and Norway, its three scintillator detectors measure the cosmic-ray flux at different latitudes: one is mounted on the Nanuq’s deck and two others are installed in schools in Italy and Norway. The detectors had to operate with the limited electric power (12 W) that was available on board, both recording impinging cosmic rays and receiving GPS signals to timestamp each event with a precision of a few tens of nanoseconds. They also had to be mechanically robust enough to withstand the stresses of rough seas.

The three Polarquest2018 detectors join a network of around 60 others in Italy called the Extreme Energy Events – Science Inside Schools (EEE) experiment, proposed by Antonino Zichichi in 2004 and presently co-ordinated by the Italian research institute Centro Fermi in Rome, with collaborators including CERN, INFN and various universities. The detectors (each made of three multigap resistive plate chambers of about 2 m² area) were built at CERN by high-school students, and the large area covered by the EEE network enables searches for very-long-distance correlations between cosmic-ray showers.
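
To give a flavour of what the network’s tens-of-nanoseconds GPS timestamps make possible, the toy sketch below searches two synthetic event streams from distant stations for time coincidences; the event rates and the coincidence window are arbitrary assumptions, not EEE parameters.

import numpy as np

# Toy coincidence search between two GPS-timestamped stations: for each
# event in station A, find the nearest-in-time event in station B and
# count pairs closer than a chosen window. Timestamps are synthetic.
rng = np.random.default_rng(3)
t_a = np.sort(rng.uniform(0, 3600e9, 50_000))   # ns within one hour
t_b = np.sort(rng.uniform(0, 3600e9, 50_000))

idx = np.searchsorted(t_b, t_a)
left = np.clip(idx - 1, 0, len(t_b) - 1)
right = np.clip(idx, 0, len(t_b) - 1)
nearest = np.minimum(np.abs(t_a - t_b[left]), np.abs(t_a - t_b[right]))

window_ns = 200.0
print(f"pairs within {window_ns} ns: {np.sum(nearest < window_ns)}")
# A genuine long-distance shower correlation would appear as an excess
# over this accidental-coincidence rate.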

A pivotal moment in the Arctic expedition came when the Nanuq arrived close to the south coast of the Svalbard archipelago and was sailing in the uncharted waters of the Recherche Fjord. While the crew admired a large pod of belugas, the boat struck the shallow seabed, damaging its right daggerboard and leaving the craft perched at a 45° incline. The crew fought to free the Nanuq, but in the end had to wait almost 12 hours for the tide to rise again. Amazingly, explains Polarquest2018 project leader Paola Catapano of CERN, the incident had its advantages. “It allowed the team to check the algorithms used to correct the raw data on cosmic rays for the inclination and rolling of the boat, since the data clearly showed a decrease in the number of muons due to a reduced acceptance.”

Analysis of the Polarquest2018 data will take a few months, but preliminary results show no significant increase in the cosmic-ray flux, even at high latitudes. This is contrary to what one could naively expect considering the high density of the Earth’s magnetic field lines close to the pole, explains Luisa Cifarelli, president of Centro Fermi in Rome. “The lack of increase in the cosmic flux confirms the hypothesis formulated by Lemaître in 1932, with much stronger experimental evidence than was available up to now, and with data collected at latitudes where no published results exist,” she says. The Polarquest2018 detector has since embarked on a road trip to measure cosmic rays all along the Italian peninsula, collecting data over a huge latitude interval.

Heading south

Meanwhile, 20,000 km south, a Dutch expedition to the South Pole called Clean2Antarctica has just got under way, carrying a small cosmic-ray experiment from Nikhef on board a vehicle called Solar Voyager. The solar-powered cart, built from recycled 3D-printed household plastics, will make the first ground measurements in Antarctica of the muon decay rate and of charged particles from extensive-air cosmic-ray showers. Cosmic rays will be measured by a roof-mounted scintillation device as the cart makes a 1200 km, six-week-long journey from the edge of the Antarctic icefields to the geographic South Pole.

The team taking the equipment across the Antarctic to the South Pole comprises mechanical engineer Ter Velde and his wife Liesbeth, who initiated the Clean2Antarctica project and are both active ocean sailors. Back in the warmer climes of the Netherlands, researchers from Nikhef will remotely monitor for any gradients in the incoming particle fluxes as the magnetic field lines converge towards the pole. In theory, the magnetic field will funnel charged particles from the high atmosphere to the Earth’s surface, leading to higher fluxes near the pole. But the incoming muon signal should not be affected, as it comes from high-energy particles that produce air showers of charged particles, explains Nikhef project scientist Bob van Eijk. “But this is experimental physics and a first, so we will just do the measurements and see what comes out,” he says.

The scintillation panel used is adapted from the HiSPARC rooftop cosmic-ray detectors that Nikhef has been providing in high schools in the Netherlands, the UK and Denmark for the past 15 years. Under professional supervision, students and teachers build these roof-box-sized detectors themselves and run the detection programme and data-analysis in their science classes. Some 140 rooftop stations are online and many thousands of pupils have been involved over the years, stimulating interest in science and research.

Pristine backdrop

The panel being taken to Antarctica is a doubled-up version that is half the usual area of the HiSPARC panels, due to strict space restrictions. Two gyroscope systems will correct for any changes in the level of the panel while traversing the Antarctic landscape. All the instruments are solar powered, with the power coming from photovoltaic panels on two additional carts pulled by the main electric vehicle. The double detection depth of the panels will allow the photomultiplier tubes to detect muon decays as well as regular cosmic-ray particles such as electrons and photons. Data from the experiment will be relayed regularly by satellite from the Solar Voyager vehicle so that analysis can take place in parallel, and will be made public through a dedicated website.

The Clean2Antarctica expedition set off in mid-November from Union Glacier Camp station near the Antarctic Peninsula. It is sponsored by Dutch companies and by crowdfunding, and has benefitted from extensive press and television coverage. The trip will take the team across bleak snow plains and up to altitudes of 2835 m and, despite it being the height of the Antarctic summer, temperatures could drop to –30 °C. The mission aims to use the pristine backdrop of Antarctica to raise public awareness about waste reduction and recycling.

“This is one of the rare occasions that a scientific outreach programme, with genuine scientific questions targeting high-school students as prime investigators, teams up with an idealist group that tries to raise awareness on environmental issues regarding circular economy,” says van Eijk. “The plastic for the vehicles was collected by primary-school kids, while three groups of young researchers formed ‘think tanks’ to generate solutions to questions about environmental issues that industrial sponsors/partners have raised.” Polarquest2018 had a similar goal, and its MantaNet project became the first to assess the presence and distribution of microplastics in the Arctic waters north of Svalbard at a record latitude of 82.7° north. According to MantaNet project leader Stefano Alliani: “One of the conclusions already drawn by sheer observation is that even at such high latitudes the quantity of macroplastic littering the most remote and wildest beaches of our planet is astonishing.”

The post Cosmic research poles apart appeared first on CERN Courier.

]]>
Feature Two independent groups are going to Earth’s extremes to make unprecedented measurements for physics, education and the environment. https://cerncourier.com/wp-content/uploads/2018/11/CCDec18_Polar_frontis.png
J-PET’s plastic revolution https://cerncourier.com/a/j-pets-plastic-revolution/ Mon, 29 Oct 2018 09:00:03 +0000 https://preview-courier.web.cern.ch/?p=12858 A PET detector based on plastic scintillators offers whole-body imaging in addition to precision tests of fundamental symmetries.

The post J-PET’s plastic revolution appeared first on CERN Courier.

]]>
The J-PET detector

It is some 60 years since the conception of positron emission tomography (PET), which revolutionised the imaging of physiological and biochemical processes. Today, PET scanners are used around the world, in particular providing quantitative and 3D images for early-stage cancer detection and for maximising the effectiveness of radiation therapies. Some of the first PET images were recorded at CERN in the late 1970s, when physicists Alan Jeavons and David Townsend used the technique to image a mouse. While the principle of PET already existed, the detectors and algorithms developed at CERN made a major contribution to its development. Techniques from high-energy physics could now be about to enable another leap in PET technology.

In a typical PET scan, a patient is administered with a radioactive solution that concentrates in malignant cancers. Positrons from β+ decay annihilate with electrons from the body, resulting in the back-to-back emission of two 511 keV gamma rays that are registered in a crystal via the photoelectric effect. These signals are then used to reconstruct an image. Significant advances in PET imaging have taken place in the past few decades, and the vast majority of existing scanners use inorganic crystals – usually bismuth germanium oxide (BGO) or lutetium yttrium orthosilicate (LYSO) – organised in a ring to detect the emitted PET photons.

The main advantage of crystal detectors is their large stopping power, high probability of photoelectric conversion and good energy resolution. However, the use of inorganic crystals is expensive, limiting the number of medical facilities equipped with PET scanners. Moreover, conventional detectors are limited in their axial field of view: currently a distance of only about 20 cm along the body can be simultaneously examined from a single-bed position, meaning that several overlapping bed positions are needed to carry out a whole-body scan, and only 1% of quanta emitted from a patient’s body are collected. Extension of the scanned region from around 20 to 200 cm would not only improve the sensitivity and signal-to-noise ratio, but also reduce the radiation dose needed for a whole-body scan.

To address this challenge, several different designs for whole-body scanners have been introduced, based on resistive-plate chambers, straw tubes and alternative crystal scintillators. In 2009, particle physicist Paweł Moskal of Jagiellonian University in Kraków, Poland, introduced a system that uses inexpensive plastic scintillators instead of inorganic ones for detecting photons in PET systems. Called the Jagiellonian PET (J-PET) detector and based on technologies already employed in the ATLAS, LHCb, KLOE, COSY-11 and other particle-physics experiments, the system aims to enable cost-effective whole-body PET imaging.

Whole-body imaging

The current J-PET setup comprises a ring of 192 detection modules axially arranged in three layers as a barrel-shaped detector, and the construction is based on 17 patent-protected solutions. Each module consists of a 500 × 19 × 7 mm³ scintillator strip made of a commercially available material called EJ-230, with a photomultiplier tube (PMT) connected at each end. Photons are registered via the Compton effect, and each analogue signal from the PMTs is sampled in the voltage domain at four thresholds by dedicated field-programmable gate arrays.

In addition to recording the location and time of the electron–positron annihilation, J-PET determines the energy deposited by annihilation photons. The 2D position of a hit is known from the scintillator position, while the third space component is calculated from the difference in the arrival times of the signals at the two ends of the scintillator, enabling direct 3D image reconstruction. PMTs connected to both ends of the scintillator strips compensate for the low detection efficiency of plastic compared to crystal scintillators and enable multi-layer detection. A modular and relatively easy-to-transport PET scanner with a non-magnetic, low-density central part can be used as a magnetic resonance imaging (MRI)- or computed-tomography-compatible insert. Furthermore, since plastic scintillators are produced in various shapes, the J-PET approach can also be introduced for positron emission mammography (PEM) and as a range monitor for hadron therapy.
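
In code, this reconstruction step reduces to a few lines. The sketch below assumes an effective light-propagation speed of about 12.6 cm/ns in the strip, a representative figure for such plastic scintillators rather than the collaboration’s calibrated value:

# Axial hit reconstruction in a plastic strip read out by a PMT at each
# end. z = 0 is the strip centre; a hit nearer the left PMT arrives
# there earlier, giving a negative z.
V_EFF_CM_PER_NS = 12.6   # assumed effective light speed in the strip

def hit_position_cm(t_left_ns, t_right_ns):
    return 0.5 * V_EFF_CM_PER_NS * (t_left_ns - t_right_ns)

def hit_time_ns(t_left_ns, t_right_ns, half_length_cm):
    # Mean arrival time corrected for propagation from the strip centre.
    return 0.5 * (t_left_ns + t_right_ns) - half_length_cm / V_EFF_CM_PER_NS

# Example: a 0.8 ns left-right time difference on a 50 cm strip
print(hit_position_cm(10.0, 10.8))     # ~ -5.0 cm from the centre
print(hit_time_ns(10.0, 10.8, 25.0))   # ~ 8.4 ns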

The J-PET detector offers a powerful new tool to test fundamental symmetries

J-PET can also build images from positronium (a bound state of electron and positron) that gets trapped in intermolecular voids. In about 40% of cases, positrons injected into the human body create positronium with a certain lifetime and other environmentally sensitive properties. Currently this information is neither recorded nor used for PET imaging, but recent J-PET measurements of the positronium lifetime in normal and cancer skin cells indicate that the properties of positronium may be used as diagnostic indicators for cancer therapy. Medical doctors are excited by the avenues opened by J-PET. These include a larger axial view (e.g. to check correlations between organs separated by more than 20 cm in the axial direction), the possibility of performing combined PET-MRI imaging at the same time and place, and the possibility of simultaneous PET and positronium (morphometric) imaging paving the way for in vivo determination of cancer malignancy.

Such a large detector is not only potentially useful for medical applications. It can also be used in materials science, where positron annihilation lifetime spectroscopy (PALS) enables the study of voids and defects in solids, while precise measurements of positronium atoms lead to morphometric imaging and physics studies. In this latter regard, the J-PET detector offers a powerful new tool to test fundamental symmetries.

Combinations of discrete symmetries (charge conjugation C, parity P and time reversal T) play a key role in explaining the observed matter–antimatter asymmetry in the universe (CP violation) and are the starting point for all quantum field theories preserving Lorentz invariance, unitarity and locality (CPT symmetry). Positronium is a good system in which to search for C, T, CP and CPT violation via angular correlations of annihilation quanta, while positronium-lifetime measurements can be used to separate the ortho- and para-positronium states (o-Ps and p-Ps). Such decays also offer the potential observation of gravitational quantum states, and are used to test Lorentz and CPT symmetry in the framework of the Standard Model Extension.
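
A toy example shows why the lifetime separates the two states so cleanly. In vacuum, p-Ps lives about 0.125 ns and o-Ps about 142 ns (in matter, pick-off annihilation shortens the o-Ps lifetime to a few ns); the values and the event mixture below are assumptions used purely for illustration.

import numpy as np

# Tagging ortho- vs para-positronium with a simple decay-time cut.
rng = np.random.default_rng(7)
tau_p, tau_o = 0.125, 142.0                              # ns, vacuum lifetimes
decays = np.concatenate([rng.exponential(tau_p, 3000),   # assumed 75% p-Ps
                         rng.exponential(tau_o, 1000)])  # assumed 25% o-Ps
cut_ns = 5.0                                             # assumed lifetime cut
print(f"tagged as o-Ps: {np.mean(decays > cut_ns):.2f} (true fraction 0.25)")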

At J-PET, the following reaction chain is predominantly considered: ²²Na → ²²Ne* + e⁺ + νe, ²²Ne* → ²²Ne + γ, and e⁺e⁻ → o-Ps → 3γ annihilation. The detection of the 1274 keV prompt γ emission from ²²Ne* de-excitation is the start signal for the positronium-lifetime measurement. Currently, tests of discrete symmetries and quantum entanglement of photons originating from the decay of positronium atoms are the main physics topics investigated by the J-PET group. The first data taking was conducted in 2016 and six data-taking campaigns have concluded with almost 1 PB of data. Physics studies are based on data collected with a point-like source placed in the centre of the detector and covered by a porous polymer to increase the probability of positronium formation. A test measurement with a source surrounded by an aluminium cylinder was also performed. The use of a cylindrical target (figure 1, left) allows researchers to separate in space the positronium formation and annihilation (cylinder wall) from the positron emission (source). Most recently, measurements by J-PET were also performed with a cylinder with the inner wall covered by the porous material.

Figure 1

The J-PET programme aims to beat the precision of previous measurements for C, CP and CPT symmetry tests in positronium, and to be the first to observe a potential T-symmetry violation. Tests of C symmetry, on the other hand, are conducted via searches for forbidden decays of the positronium triplet state (o-Ps) to 4γ and the singlet state (p-Ps) to 3γ. Tests of the other fundamental symmetries and their combinations will be performed by measuring the expectation values of symmetry-odd operators constructed from the spin of o-Ps and the momenta and polarisation vectors of photons originating from its annihilation (figure 1, right). The physical limit of such tests is expected at the level of about 10⁻⁹ due to photon–photon interaction, which is six orders of magnitude smaller than the present experimental limits (e.g. at the University of Tokyo and by the Gammasphere experiment).
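
For instance, one CP-odd operator used in such tests is the triple product of the o-Ps spin direction with the momenta of the two most energetic annihilation photons. The toy calculation below draws isotropic, symmetry-respecting events (no detector effects), for which the expectation value is consistent with zero; a significant non-zero mean in real data would signal a violation.

import numpy as np

# Expectation value of the CP-odd operator S . (k1 x k2) over synthetic,
# isotropically generated events (directions only).
rng = np.random.default_rng(1)
n = 100_000

def unit_rows(v):
    return v / np.linalg.norm(v, axis=1, keepdims=True)

spin = unit_rows(rng.normal(size=(n, 3)))   # o-Ps spin direction
k1 = unit_rows(rng.normal(size=(n, 3)))     # most energetic photon
k2 = unit_rows(rng.normal(size=(n, 3)))     # second photon

op = np.einsum('ij,ij->i', spin, np.cross(k1, k2))
print(f"<S.(k1 x k2)> = {op.mean():+.4f} +/- {op.std() / np.sqrt(n):.4f}")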

Since J-PET is built of plastic scintillators, it provides an opportunity to determine the photon’s polarisation through the registration of primary and secondary Compton scatterings in the detector. This, in turn, enables the study of multi-partite entanglement of photons originating from the decays of positronium atoms. The survival of particular entanglement properties in the mixing scenario may make it possible to extract quantum information in the form of distinct entanglement features, e.g. from metabolic processes in human bodies.

Currently a new, fourth J-PET layer is under construction (figure 2), with a single unit of the layer comprising 13 plastic-scintillator strips. With a mass of about 2 kg per single detection unit, it is easy to transport and to build on-site a portable tomographic chamber whose radius can be adjusted for different purposes by using a given number of such units.

Figure 2

The J-PET group is a collaboration between several Polish institutions – Jagiellonian University, the National Centre for Nuclear Research Świerk and Maria Curie-Skłodowska University – as well as the University of Vienna and the National Laboratory in Frascati. The research is funded by the Polish National Centre for Research and Development, by the Polish Ministry of Science and Higher Education and by the Foundation for Polish Science. Although general interest in improving the quality of medical diagnosis was the first step towards this new detector for positron annihilation, today the basic-research programme is equally advanced. The only open question at J-PET is whether a high-resolution full-human-body tomographic image will be presented before the most precise test of one of nature’s fundamental symmetries.

The post J-PET’s plastic revolution appeared first on CERN Courier.

]]>
Feature A PET detector based on plastic scintillators offers whole-body imaging in addition to precision tests of fundamental symmetries. https://cerncourier.com/wp-content/uploads/2018/10/CCNov18_J-PET-frontisHR-1.png
First human 3D X-ray in colour https://cerncourier.com/a/first-human-3d-x-ray-in-colour/ Fri, 31 Aug 2018 08:00:33 +0000 https://preview-courier.web.cern.ch/?p=12575 New-Zealand company MARS Bioimaging Ltd has used technology developed at CERN to perform the first colour 3D X-ray of a human body, offering more accurate medical diagnoses.

The post First human 3D X-ray in colour appeared first on CERN Courier.

]]>
3D colour x-ray image

New Zealand company MARS Bioimaging Ltd has used technology developed at CERN to perform the first colour 3D X-ray of a human body, offering more accurate medical diagnoses. Father and son researchers Phil and Anthony Butler from Canterbury and Otago universities in New Zealand spent a decade building their product using Medipix read-out chips, which were initially developed to address the needs of particle tracking in experiments at the Large Hadron Collider.

The CMOS-based Medipix read-out chip works like a camera, detecting and counting each individual particle hitting the pixels when its shutter is open. The resulting high-resolution, high-contrast images make it unique for medical-imaging applications. Successive generations of chips have been developed during the past 20 years with many applications outside high-energy physics. The latest, Medipix3, is the third generation of the technology, developed by a collaboration of more than 20 research institutes – including the University of Canterbury.

MARS Bioimaging Ltd was established in 2007 to commercialise Medipix3 technology. The firm’s product combines spectroscopic information generated by a Medipix3-enabled X-ray detector with powerful algorithms to generate 3D images. The colours represent different energy levels of the X-ray photons as recorded by the detector, hence identifying different components of body parts such as fat, water, calcium and disease markers.
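
The principle behind those colours can be sketched in a few lines: a photon-counting pixel compares each pulse against several thresholds, and differences between the threshold counters give the contents of the energy bins. The thresholds and toy spectrum below are invented for illustration and do not describe the actual Medipix3 configuration.

import numpy as np

# Energy binning in a photon-counting pixel: count photons above each
# comparator threshold, then difference the counters to get bin contents.
rng = np.random.default_rng(0)
thresholds_kev = [20, 35, 50, 65]               # assumed comparator levels

def bin_counts(energies_kev):
    above = [int(np.sum(energies_kev >= t)) for t in thresholds_kev]
    # Bins: [20,35), [35,50), [50,65) and >= 65 keV
    return [a - b for a, b in zip(above, above[1:])] + [above[-1]]

photons = rng.gamma(shape=4.0, scale=12.0, size=10_000)   # toy spectrum
print(bin_counts(photons))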

So far, researchers have been using a small version of the MARS scanner to study cancer, bone and joint health, and vascular diseases that cause heart attacks and strokes. In the coming months, however, orthopaedic and rheumatology patients in New Zealand will be scanned by the new apparatus in a world-first clinical trial. “In all of these studies, promising early results suggest that when spectral imaging is routinely used in clinics it will enable more accurate diagnosis and personalisation of treatment,” said Anthony Butler.

The post First human 3D X-ray in colour appeared first on CERN Courier.

]]>
News New-Zealand company MARS Bioimaging Ltd has used technology developed at CERN to perform the first colour 3D X-ray of a human body, offering more accurate medical diagnoses. https://cerncourier.com/wp-content/uploads/2018/08/CCSep18News-xray-1.png
LHC upgrade brings benefits beyond physics https://cerncourier.com/a/lhc-upgrade-brings-benefits-beyond-physics/ Fri, 31 Aug 2018 09:00:02 +0000 https://preview-courier.web.cern.ch/?p=12636 The high-luminosity LHC promises a quantifiable return to society in terms of scientific, economic and cultural value.

The post LHC upgrade brings benefits beyond physics appeared first on CERN Courier.

]]>
Summer students

CERN is a unique international research infrastructure whose societal impacts go well beyond advancing knowledge in high-energy physics. These do not just include technological spillovers and benefits to industry, or unique inventions such as the World Wide Web, but also the training of skilled individuals and wider cultural effects. The scale of modern particle-physics research is such that single projects, such as the Large Hadron Collider (LHC) at CERN, offer an opportunity to weigh up the returns on public investment in fundamental science.

Recently, the European Commission (EC) introduced requirements for large research infrastructures to estimate their socioeconomic impact. A quantitative estimate can be obtained via a social cost–benefit analysis (CBA), a well-established methodology in economics. Successfully passing a social CBA test is required for co-financing major projects with the European Regional Development Fund and the Cohesion Fund. The EC’s Horizon 2020 programme also specifically mentions that the preparatory phase of new projects that are members of the European Strategy Forum on Research Infrastructures (ESFRI) should include a social CBA.

Fig. 1.

Against this background, our team at the University of Milan in Italy was invited by CERN’s Future Circular Collider (FCC) study to carry out a social CBA of the high-luminosity LHC (HL-LHC) upgrade project, also preparing the ground for further analysis of larger, post-LHC projects. Involving three years of work and extending an initial study concerning the LHC carried out between 2014 and 2016, the report assesses the HL-LHC’s economic costs and benefits up to 2038, by which time the machine will have ceased operations. Here, we summarise the main findings of our analysis, which also includes the most comprehensive survey to date concerning the public’s willingness to pay for CERN investment projects.

Estimating value

Since the aim of the HL-LHC project is to extend the discovery potential of the LHC after 2025, it is also expected to prolong its impact on society. To evaluate such an effect, we require a CBA model that estimates the expected net present value (NPV) of a project at the end of a defined observation period. The NPV is calculated from the net flow of discounted benefits generated by the investment. Uncertainty surrounding the estimation of costs and benefits is tackled with Monte Carlo simulations based on probabilities attached to the variables underlying the analysis. For the HL-LHC, the relevant benefits were taken to be: the value of training for early-stage researchers; technological or industrial spillovers to industry; cultural effects for the public; academic publications for scientists; and the public-good value for citizens (figure 1). A research infrastructure passes the CBA test when, over time, the cumulated benefits exceed its costs for society, i.e. when the expected NPV is greater than zero. By design, a CBA does not account for the value of scientific discoveries and results themselves, since the aim of such studies is to quantify the extra benefits that come from this type of investment.
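
The mechanics of such an estimate are easy to sketch. The toy Monte Carlo below draws uncertain annual benefits, discounts the net flows and tabulates the resulting NPV distribution; every number in it is invented for illustration and bears no relation to the figures of the actual study.

import numpy as np

# Toy Monte Carlo NPV: discounted benefit-minus-cost flows with uncertain
# annual benefits. All monetary values (MCHF) and rates are assumptions.
rng = np.random.default_rng(42)
years = np.arange(2025, 2039)          # assumed observation period
t = years - years[0]
discount = 0.03                        # assumed social discount rate

n = 100_000
costs = 150.0                                              # MCHF per year
benefits = rng.normal(220.0, 60.0, size=(n, len(years)))   # MCHF per year
npv = ((benefits - costs) / (1.0 + discount) ** t).sum(axis=1)

print(f"E[NPV] = {npv.mean():.0f} MCHF, P(NPV > 0) = {(npv > 0).mean():.2f}")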

Fig. 2.

Two scenarios were considered: a baseline scenario with the HL-LHC upgrade and a counterfactual scenario that includes the operation of the LHC until the end of its life without the upgrade. In both scenarios, the total costs include past and future expenditures attributed to the LHC accelerator complex and by the four main LHC experiment collaborations: ATLAS, CMS, LHCb and ALICE. The difference between the total cost (which includes capital and operational expenditures) in the two scenarios is about 2.9 billion Swiss francs.

HL-LHC benefits

For the HL-LHC, one of the most significant benefits, representing at least a third of the total, was the value of training for early-stage researchers (figure 2). It was shown that the 2038 cohort of early-stage researchers will enjoy a “salary premium” due to their experience at the HL-LHC or LHC until 2080, as confirmed by surveys of students, formers students and more than 330 team leaders.

Fig. 3.

The economic benefit from industrial spillovers, software and communication technologies is another major factor, together representing 40% of the project’s total benefits. Software and communication technology represents 24% of the total benefits in this category, while the rest comes from the additional profits for high-tech companies involved in the HL-LHC (figure 3). We looked at the value of hi-tech procurement contracts for the HL-LHC, drawing from three different empirical analyses: an econometric study of company accounts over the long term, before and after the first contract with CERN; a survey of more than 650 CERN suppliers; and 28 case studies. In the case of the HL-LHC, incremental profits for firms represent 16% of the total benefits from sales to customers other than CERN, and this percentage increases to 29% if we consider the difference between the HL-LHC and the counterfactual scenario of no HL-LHC upgrade.

CERN and society

Cultural effects, while uncertain because they depend on future announcements of discoveries and communication strategies, were estimated to contribute 13% to the total HL-LHC benefits. More than half of this comes from onsite visitors to CERN and its travelling exhibitions.

Contributing just 2% of the total benefits in the HL-LHC scenario, scientific publications (relating to their quantity and citations, not their contents) represent the smallest overall socioeconomic benefit category. This is expected given the relatively small size of the high-energy physics community compared to other social groups.

The public-good value of HL-LHC, estimated to be 12% of the total, was inferred from a survey of taxpayers’ willingness to pay for a research infrastructure such as CERN. A first estimate was carried out in our assessment of the LHC benefits published in 2016, but recently we have refined this estimate based on an extensive survey in one of CERN’s two host states, France (see box). A similar survey is planned in CERN’s other host state, Switzerland.

Fig. 4.

Taking all this into account, including the uncertainties in critical variables and relying on Monte Carlo simulations to estimate the probabilities of costs, benefits and the NPV of the project, our analysis showed that the HL-LHC has a clear, quantifiable economic benefit for society (figure 4). Overall, the ratio between the incremental benefits and incremental costs of the HL-LHC with respect to the continued operation of the LHC under normal consolidation (i.e. without high-luminosity upgrade) is 1.8. This means that each Swiss franc invested in the HL-LHC upgrade project pays back approximately 1.8 Swiss francs in societal benefits, mainly stemming from the value of the skills acquired by students and postdocs, and from industrial spillovers. The study is also based on very conservative assumptions about the potential benefits.

What conclusions should CERN draw from this analysis? First, given that the benefits to early-stage researchers are the single most important societal benefit, CERN could invest more in activities facilitating the transition to the international job market. Similarly, cooperative relations with suppliers of technology and the development of innovative software, data storage, networking and computing solutions are strategic levers that CERN could use to boost its social benefits. Finally, cultural effects, especially those related to onsite visitors and social media, have great potential for generating societal benefits, hence outreach and communication strategies are important.

Exhibition

There are also lessons regarding CERN’s investments in future particle accelerators. The HL-LHC project yields significant socio-economic value, well in excess of its costs and in addition to its scientific output. Extrapolating these results, it can be expected that future colliders at CERN, like those considered by the FCC study, would bring the same kind of social benefits, but on a bigger scale. Further research is needed on the socio-economic impact of new long-term investment scenarios.

The post LHC upgrade brings benefits beyond physics appeared first on CERN Courier.

]]>
Feature The high-luminosity LHC promises a quantifiable return to society in terms of scientific, economic and cultural value. https://cerncourier.com/wp-content/uploads/2018/08/CCSep18Cost-exhibition.png
Big science meets industry in Copenhagen https://cerncourier.com/a/big-science-meets-industry-in-copenhagen/ Fri, 23 Mar 2018 15:42:19 +0000 https://preview-courier.web.cern.ch?p=13368 The Big Science Business Forum (BSBF), held in Copenhagen, Denmark, saw delegates discuss opportunities in the current big-science landscape

The post Big science meets industry in Copenhagen appeared first on CERN Courier.

]]>

Big science equals big business, whether it is manufacturing giant superconducting magnets for particle colliders or perfecting mirror coatings for space telescopes. The Big Science Business Forum (BSBF), held in Copenhagen, Denmark, on 26–28 February, saw more than 1000 delegates from more than 500 companies and organisations spanning 30 countries discuss opportunities in the current big-science landscape.

Nine of the world’s largest research facilities – CERN, EMBL, ESA, ESO, ESRF, ESS, European XFEL, F4E and ILL – offered insights into procurement opportunities and orders totalling more than €12 billion for European companies in the coming years. These range from advisory engineering work and architectural tasks to advanced technical equipment, construction projects and radiation-resistant materials. A further nine organisations also joined the conference programme: ALBA, DESY, ELI-NP, ENEA, FAIR, MAX IV, SCK•CEN – MYRRHA, PSI and SKA, thereby gathering 18 of the world’s most advanced big-science organisations under one roof.

The big-science market is currently fragmented by the varying quality standards and procurement procedures of the different laboratories, delegates heard. BSBF aspired to offer a space to discuss the entry challenges for businesses and suppliers – including small- and medium-sized enterprises – who can be valuable business partners for big-science projects.

“The vision behind BSBF is to provide an important stepping stone towards establishing a stronger, more transparent and efficient big-science market in Europe and we hope that this will be the first of a series of BSBFs in different European cities,” said Agnete Gersing of the Danish ministry for higher education and science during the opening address.

Around 700 one-to-one business meetings took place, and delegates also visited the European Spallation Source and MAX IV facility just across the border in Lund, Sweden. Parallel sessions covered big science as a business area, addressing topics such as the investment potential and best practices of Europe’s big-science market.

“Much of the most advanced research takes place at big-science facilities, and their need for high-tech solutions provides great innovation and growth opportunities for private companies,” said Danish minister for higher education and science, Søren Pind.

The post Big science meets industry in Copenhagen appeared first on CERN Courier.

]]>
Meeting report The Big Science Business Forum (BSBF), held in Copenhagen, Denmark, saw delegates discuss opportunities in the current big-science landscape https://cerncourier.com/wp-content/uploads/2018/03/CCApr18_FP-bigscience.jpg
CERN and Member States talk med-tech https://cerncourier.com/a/cern-and-member-states-talk-med-tech/ Mon, 15 Jan 2018 16:04:19 +0000 https://preview-courier.web.cern.ch?p=13376 The first annual knowledge-transfer thematic forum on medical applications takes place at CERN

The post CERN and Member States talk med-tech appeared first on CERN Courier.

]]>
3D colour X-ray imaging of a mouse

The first annual knowledge-transfer thematic forum on medical applications took place at CERN on 30 November, bringing CERN and its Member State and associate Member State representatives together to discuss the application of CERN’s technologies and know-how to the medical field.

The knowledge transfer (KT) forum, known as ENET until the end of 2015, comprises one or more representatives for each country, allowing CERN to develop common approaches with its Member States and to identify potential industry and academic partners while minimising duplication of effort. Medical applications are one of CERN’s most significant KT activities, and this year CERN gave each country the chance to nominate an expert in the field to attend special sessions of the KT forum dedicated to medical applications.

Some 20 invited speakers from the physics and medical communities took part in the inaugural event in November. The scope of the discussions demonstrated CERN’s deep and longstanding involvement in areas such as medical imaging, hadron therapy and computing, and highlighted the enormous potential for future applications of high-energy physics technologies to the medical arena.

After an introduction regarding CERN’s strategy for medical applications and the governance put in place for these activities (see “Viewpoint”), much of the event was devoted to updates from individual Member States and associate Member States, where much activity is taking place. Some of them clearly indicated that medical applications are an important activity in their countries, and that engaging with CERN more closely is of great added value to such efforts.

In the second half of the meeting, presentations from CERN experts introduced the various technology fields in which CERN is already actively pursuing the application of its technologies to medicine, such as high-field superconducting magnets, computing and simulations, and high-performance particle detectors.

The event was an all-round success, and more will follow this year to continue identifying ways in which CERN can contribute to the medical applications strategy of its Member States.

The post CERN and Member States talk med-tech appeared first on CERN Courier.

]]>
Meeting report The first annual knowledge-transfer thematic forum on medical applications takes place at CERN https://cerncourier.com/wp-content/uploads/2018/01/CCfac14_01_18.jpg
Novartis acquires CERN spin-off https://cerncourier.com/a/novartis-acquires-cern-spin-off/ Mon, 15 Jan 2018 09:15:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/novartis-acquires-cern-spin-off/ Global healthcare company Novartis has announced plans to acquire Advanced Accelerator Applications (AAA), a spin-off radiopharmaceutical firm established by former CERN physicist Stefano Buono in 2002. With an expected price of $3.9B, said the firm in a statement, the acquisition will strengthen Novartis’ oncology portfolio by introducing a new therapy platform for tackling neuroendocrine tumours. […]

The post Novartis acquires CERN spin-off appeared first on CERN Courier.

]]>

Global healthcare company Novartis has announced plans to acquire Advanced Accelerator Applications (AAA), a spin-off radiopharmaceutical firm established by former CERN physicist Stefano Buono in 2002. With an expected price of $3.9 billion, the firm said in a statement, the acquisition will strengthen Novartis’ oncology portfolio by introducing a new therapy platform for tackling neuroendocrine tumours. Trademarked Lutathera, and based on the isotope lutetium-177, the technology was approved in Europe in September 2017 for the treatment of certain neuroendocrine tumours and is under review in the US.

With its roots in nuclear-physics expertise acquired at CERN, AAA started its commercial activity with the production of radiotracers for medical imaging. The successful model made it possible for AAA to invest in nuclear research to produce innovative radiopharmaceuticals. “We believe that the combination of our expertise in radiopharmaceuticals and theragnostic strategy together with the global oncology experience and infrastructure of Novartis, provide the best prospects for our patients, physicians and employees, as well as the broader nuclear medicine community,” said Buono, who is CEO of AAA.

The post Novartis acquires CERN spin-off appeared first on CERN Courier.

]]>
News https://cerncourier.com/wp-content/uploads/2018/06/CCnew4_01_18-1.jpg
Therapeutic particles https://cerncourier.com/a/therapeutic-particles/ Mon, 15 Jan 2018 09:15:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/therapeutic-particles/ The accelerator technology underpinning Europe’s first particle-therapy facilities was driven by the TERA Foundation during the past 25 years.

The post Therapeutic particles appeared first on CERN Courier.

]]>

Last September the TERA Foundation – dedicated to the study and development of accelerators for particle therapy – celebrated its 25th anniversary. Led by visionary Italian physicist Ugo Amaldi, TERA gathered and trained hundreds of brilliant scientists who carried out research on accelerator physics. This culminated in the first carbon-ion facility for hadron therapy in Italy, and the second in Europe: the National Centre for Cancer Hadron Therapy (CNAO), located in Pavia, which treated its first patient in 2011.

The forerunner to CNAO was the Heidelberg Ion-Beam Therapy Centre (HIT) in Germany, which treated its first patient in 2009 following experience accumulated over 12 years in a pilot project at GSI near Darmstadt. After CNAO came the Marburg Ion-Beam Therapy Centre (MIT) in Germany, which has been operational since 2015, and MedAustron in Wiener Neustadt, Austria, which delivered its first treatment in December 2016.

While conventional radiotherapy based on beams of X-rays or electrons is already widespread worldwide, the treatment of cancer with charged particles has seen significant growth in recent years. The use of proton beams in radiation oncology was first proposed in 1946 by Robert Wilson, a student of Ernest Lawrence and founding director of Fermilab. The key advantage of proton beams over X-rays is that the absorption profile of protons in matter exhibits a sharp peak towards the end of their path, concentrating the dose on the tumour target while sparing healthy tissues. Following the first treatment of patients with protons at Lawrence Berkeley Laboratory in the US in 1954, treatment centres in the US, the former USSR and Japan gradually appeared. At the same time, interest arose around the idea of using heavier ions, which offer a higher radiobiological effectiveness and, by causing more severe damage to DNA, can control the 3% of all tumours that are radioresistant both to X-rays and protons. It is expected that by 2020 there will be almost 100 centres delivering particle therapy around the world, with more than 30 of them in Europe (see “The changing landscape of cancer therapy”).
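
The depth–energy relationship behind that sharp peak is well approximated by the empirical Bragg–Kleeman rule, R = αE^p. The sketch below uses commonly quoted water parameters (α ≈ 0.0022, p ≈ 1.77, with R in cm and E in MeV); it is a back-of-the-envelope illustration, not a treatment-planning formula.

# Approximate proton range in water via the Bragg-Kleeman rule,
# R = alpha * E**p (R in cm, E in MeV); parameters are textbook values.
ALPHA, P = 0.0022, 1.77

def proton_range_cm(energy_mev):
    return ALPHA * energy_mev ** P

for e in (62, 74, 150, 230):
    print(f"{e:3d} MeV -> ~{proton_range_cm(e):4.1f} cm in water")
# ~230 MeV reaches roughly 33 cm, enough for deep-seated tumours.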

Europe entered the hadron-therapy field in 1987, when the European Commission launched the European Light Ion Medical Accelerator (EULIMA) project to realise a particle-therapy centre. The facility was not built in the end, but interest in the topic continued to grow. In 1991, together with Italian medical physicist Giampiero Tosi, Amaldi wrote a report outlining the design of a hospital facility for therapy with light ions and protons to be built in Italy. One year later, the pair established the TERA Foundation to raise the necessary funding to employ students and researchers to work on the project. Within months, TERA could count on the work of about 100 physicists, engineers, medical doctors and radiobiologists, who joined forces to design a synchrotron for particle therapy and the beamlines and monitoring systems necessary for its operation.

Ten years of ups and downs followed, during which TERA scientists developed three designs for a proton-therapy facility initially to be built in Novara, then in the outskirts of Milan and finally in Pavia. Political, legislative and economic issues delayed the project until 2001 when, thanks to the support of Italian health minister and oncologist Umberto Veronesi, the CNAO Foundation was created. The construction of the actual facility began four years later.

“We passed through hard times and we had to struggle, but we never gave up,” says Amaldi. “Besides, we kept ourselves busy with improving the design of our accelerator.”

Introducing PIMMS

Meanwhile, in Austria, experimental physicist Meinhard Regler had launched a project called Austron – a sort of precursor to the European Spallation Source. In 1995, together with the head designer – accelerator physicist Phil Bryant – he proposed the addition of a ring to the facility that would be used for particle therapy (and led to the name of the project being changed to MedAustron). Amaldi, Regler and Bryant then decided to work on a common project, and the “Proton-Ion Medical Machine Study” (PIMMS) was created. Developed at CERN between 1996 and 2000 under the leadership of Bryant and with the collaboration of several CERN physicists and engineers, PIMMS aimed to be a toolkit for any European country interested in building a proton–ion facility for hadron therapy. Rather than being a blueprint for a final facility on a specific site, it was an open study from which different parts could be included in any hadron-therapy centre according to its specific needs.

The design of CNAO itself is based on the PIMMS project, with some modifications introduced by TERA to reduce the footprint of the structure. The MedAustron centre, designed in the early 2000s, also drew upon the PIMMS report. Built between 2011 and 2013, with the first beam extracted by the synchrotron in autumn 2014, MedAustron received official certification as a centre for cancer therapy in December 2016 and, a few days after, treated its first patient. “In the past few years we have worked hard to provide the MedAustron trainees with a unique opportunity to acquire CERN’s know-how in the diverse fields of accelerator design, construction and operation,” says Michael Benedikt of CERN, who led the MedAustron accelerator project. Synergies with other CERN projects were also created, he explains. “The vacuum control system built for MedAustron was successfully used in the Linac4 test set-up, while in the synchrotron a novel radiofrequency system that was jointly developed for the CERN PS Booster and MedAustron is used. The synchrotron’s power converter control uses the same top-notch technology as CERN’s accelerators, while its control system and several of its core components are derived from technologies developed for the CMS experiment.”

All the existing facilities using hadrons for cancer therapy are based on circular machines: cyclotrons and synchrotrons. For some years, however, the TERA Foundation has been working on the design of a linear accelerator for hadron therapy. As early as 1993, Amaldi set up a study group, in collaboration with the Italian institutions ENEA and INFN, dedicated to the design of a linac for protons that would run at the same frequency (3 GHz) as the electron linacs used for conventional radiotherapy. The linac could use a cyclotron as an injector, making it a hybrid solution called a cyclinac, which reduces the sizes of both accelerators while allowing the beam energy to be rapidly changed from pulse to pulse by acting on the radiofrequency system of the linac. In 1998 a 3 GHz 1.2 metre-long linac booster (LIBO) was built by a TERA–CERN–INFN collaboration led by retired CERN engineer Mario Weiss, and in 2001 it was connected to the cyclotron of the INFN South Laboratories in Catania, where it accelerated protons from 62 MeV to 74 MeV. This was meant to be the first of 10 modules that would kick protons to 230 MeV.

Linear ambition

In 2007 a CERN spin-off company called ADAM (Applications of Detectors and Accelerators to Medicine) was founded by businessman Alberto Colussi to build a commercial high-frequency linac based on the TERA design. Under the leadership of Stephen Myers, a former CERN director for accelerators and technology and initiator of the CERN medical applications office, ADAM is now completing the first prototype. It is called Linac for Image Guided Hadron Therapy (LIGHT), and the full accelerator comprises: a proton source; a novel 750 MHz RF quadrupole (RFQ) – designed by CERN – which takes the particles up to 5 MeV; four side-coupled drift-tube linacs (SCDTL) – designed by ENEA – to accelerate the beam from 5–37.5 MeV; and a different type of accelerating module, called coupled-cavity linac (CCL) – the LIBO designed by TERA – which gives the final kick to the beam from 37.5 to 230 MeV. The complex will be 24 m long, similar to the circumference of a proton synchrotron.

Compared to cyclotrons and synchrotrons, linear accelerators are lighter and potentially less costly because they are modular. Most importantly, they produce a beam much better suited to treating patients, in particular when the tumour is moving, as in the lungs. The machine developed by ADAM has a modular structure, making it easier to maintain and more flexible to upgrade or customise. In addition, thanks to an active longitudinal modulation system, the beam energy – and thus the treatment depth – can be varied during therapy. LIGHT also has a dynamic transverse modulation system, allowing the beam to be modulated rapidly and precisely to “paint” the tumour many times in a short period – in other words, delivering a homogeneous dose to the whole cancerous tissue while minimising the irradiation of healthy organs. The energy variation of cyclotrons and synchrotrons is 20–100 times slower.

“The beauty of the linac is that you can electronically modulate its output energy,” Myers explains. “Since our accelerator is modular, the energy can be changed either by switching off some of the units, by reducing the power in all of them, or by re-phasing the units. Another big advantage of the linac is that it has a small emittance, i.e. beam size, which translates into smaller, lighter and cheaper magnets and allows for a simpler and lighter gantry as well.” In the last decade, LIBO has inspired other TERA projects. Its scientists have designed a linac booster for carbon ions (LIBO itself was only for protons) and a compact single-room facility called TULIP, in which a 7 m-long proton linac is mounted on a rotating gantry.
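To make pulse-to-pulse energy selection concrete, the sketch below models a modular chain using the LIGHT design energies quoted above (RFQ to 5 MeV, SCDTL to 37.5 MeV, CCL to 230 MeV). It is only an illustration under stated assumptions: the number of independently powered CCL modules (ten, echoing the original LIBO plan) and the even energy split between them are our simplifications, not ADAM’s published parameters.

```python
# Illustrative model of per-pulse energy selection in a modular linac.
# Stage energies follow the LIGHT figures quoted in the text; the module
# count and the even energy split per CCL module are assumptions.

from dataclasses import dataclass

@dataclass
class Stage:
    name: str
    modules: int   # independently powered RF modules (assumed)
    e_in: float    # beam energy entering the stage (MeV)
    e_out: float   # beam energy leaving the stage (MeV)

    @property
    def gain_per_module(self) -> float:
        return (self.e_out - self.e_in) / self.modules

rfq   = Stage("RFQ (750 MHz)", 1, 0.0, 5.0)
scdtl = Stage("SCDTL", 4, 5.0, 37.5)
ccl   = Stage("CCL (LIBO-type)", 10, 37.5, 230.0)

def output_energy(active_ccl_modules: int) -> float:
    """Beam energy when only the first N CCL modules are powered."""
    n = max(0, min(active_ccl_modules, ccl.modules))
    return scdtl.e_out + n * ccl.gain_per_module

for n in (10, 7, 4, 0):
    print(f"{n:2d} CCL modules on -> {output_energy(n):6.2f} MeV")
```

In practice the output energy can be varied continuously, since the last active module can also be run at reduced power or re-phased; the point of the sketch is simply that depth selection becomes an electronic, per-pulse operation rather than a mechanical one.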

The new frontier of hadron therapy, however, could be treatment with helium ions. Some tests with these ions were done in the past, but the technique has still to be proven. TERA scientists are currently working on a new accelerator for helium ions, says Amaldi. “Helium can bring great benefit to medical treatments: it is lighter than carbon, thus requiring a smaller accelerator, and it has much less lateral scattering than protons, resulting in sharper lateral fall-offs next to organs at risk.” Accelerating helium ions in a linac requires either a longer machine than one for protons or higher accelerating gradients, such as those demonstrated by high-energy physics research at CERN and elsewhere in Europe (a fully stripped helium ion carries half the charge-to-mass ratio of a proton, so more accelerating voltage per nucleon is needed to reach the same range). The need for future compact and cost-effective ion-therapy accelerators is being addressed by a new collaborative design study coordinated by Maurizio Vretenar and Alessandra Lombardi of CERN, dubbed “PIMMS2”. A proposal, which includes a carbon linac, is being prepared for submission to the CERN Medical Application group, potentially opening the next phase of TERA’s impressive journey.

Isotopes for precision medicine https://cerncourier.com/a/isotopes-for-precision-medicine/ Mon, 15 Jan 2018 09:15:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/isotopes-for-precision-medicine/ CERN’s MEDICIS facility is producing novel radioisotopes for medical research.

The use of radioisotopes to treat cancer goes back to the late 19th century, with the first clinical trials taking place in France and the US at the beginning of the 20th century. Great strides have been made since, and today radioisotopes are widely used by the medical community. Produced mostly in dedicated reactors, they serve precision medicine both to diagnose cancers and other diseases, such as heart irregularities, and to deliver very small radiation doses exactly where they are needed, sparing the surrounding healthy tissue.

However, many currently available isotopes do not combine the most appropriate physical and chemical properties and, in the case of certain tumours, a different type of radiation could be better suited. This is particularly true of the aggressive brain cancer glioblastoma multiforme and of pancreatic adenocarcinoma. Although external beam gamma radiation and chemotherapy can improve patient survival rates, there is a clear need for novel treatment modalities for these and other cancers.

On 12 December, a new facility at CERN called MEDICIS produced its first radioisotopes: a batch of terbium-155 (155Tb), one of the four isotopes of the 149/152/155/161Tb family – a quadruplet considered promising for both diagnosis and treatment. MEDICIS is designed to produce unconventional radioisotopes with the right properties to enhance the precision of both patient imaging and treatment. It will expand the range of radioisotopes available – some of which can be produced only at CERN – and send them to hospitals and research centres in Switzerland and across Europe for further study.

Initiated in 2010 by CERN with contributions from the Knowledge Transfer Fund, private foundations and partner institutes, and also benefitting from a European Commission Marie Skłodowska-Curie training grant titled MEDICIS-Promed, MEDICIS is driven by CERN’s Isotope Mass Separator Online (ISOLDE) facility. ISOLDE has been running for 50 years, producing 1300 different isotopes of 73 chemical elements for research in many areas, including fundamental nuclear research, astrophysics and life sciences.

Although ISOLDE already produces isotopes for medical research, MEDICIS will more regularly produce isotopes with specific types of emission, tissue penetration and half-life – all purified based on expertise acquired at ISOLDE. This will allow CERN to provide radioisotopes meeting the requirements of the medical research community as a matter of course.

ISOLDE directs a high-intensity proton beam from the Proton Synchrotron Booster onto specially developed thick targets, yielding a large variety of atomic fragments. Different devices are used to ionise, extract and separate nuclei according to their masses, forming a low-energy beam that is delivered to various experimental stations. MEDICIS works by placing a second target behind ISOLDE’s: once the isotopes have been produced on the MEDICIS target, an automated conveyor belt carries them to a facility where the radioisotopes of interest are extracted via mass separation and implanted in a metallic foil. The final product is then delivered to local research facilities including the Paul Scherrer Institute, the University Hospital of Vaud and Geneva University Hospitals.
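Because the collected isotopes decay from the moment they are separated, transport time directly constrains which institutes MEDICIS can usefully serve. Here is a minimal sketch of the arithmetic, assuming the literature half-life of roughly 5.3 days for 155Tb; the initial activity and journey times are invented for illustration.

```python
# Decay of a shipped radioisotope batch: A(t) = A0 * exp(-ln2 * t / t_half).
# The 155Tb half-life (~5.3 days) is the literature value; the initial
# activity and the journey times below are invented for illustration.

import math

def remaining_activity(a0_mbq: float, t_h: float, t_half_h: float) -> float:
    """Activity (MBq) left after t_h hours, for a half-life of t_half_h hours."""
    return a0_mbq * math.exp(-math.log(2) * t_h / t_half_h)

T_HALF_TB155_H = 5.3 * 24  # half-life in hours

for destination, hours in [("local hospital", 1), ("cross-border courier", 18)]:
    a = remaining_activity(100.0, hours, T_HALF_TB155_H)
    print(f"{destination:20s}: {a:5.1f} MBq left of an initial 100 MBq")
```

For 155Tb the losses are modest, but for the shortest-lived member of the quadruplet, 149Tb (half-life of roughly four hours), the same 18-hour journey would cost about 95% of the activity – one reason the first receiving institutes are all within easy reach of Geneva.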

Clinical setting

Once in a medical-research environment, researchers dissolve the isotope and attach it to a molecule, such as a protein or sugar, chosen to target the tumour precisely. This makes the isotope injectable, and the molecule can then adhere to the tumour or organ that needs imaging or treating. Selected isotopes will first be tested in vitro, and then in vivo using mouse models of cancer. Researchers will test the isotopes both for their direct effect on tumours and when coupled to peptides with tumour-homing capacity, and will establish new delivery methods for brachytherapy – using stereotactic or robot-assisted surgery in large-animal models – to target glioblastoma, pancreatic adenocarcinoma and neuroendocrine tumour cells.

MEDICIS is not just a world-class facility for novel radioisotopes. It also marks the entrance of CERN into the growing field of theranostics, whereby physicians verify and quantify the presence of cellular and molecular targets in a given patient with a diagnostic radioisotope, before treating the disease with the therapeutic radioisotope. The prospect of a dedicated facility at CERN for the production of innovative isotopes, together with local leading institutes in life and medical sciences and a large network of laboratories, gives MEDICIS an exciting scientific programme in the years to come. It is also a prime example of the crossover between fundamental physics research and health applications, with accelerators set to play an increasing role in the production of life-changing medical isotopes.

The changing landscape of cancer therapy https://cerncourier.com/a/the-changing-landscape-of-cancer-therapy/ Mon, 15 Jan 2018 09:15:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/the-changing-landscape-of-cancer-therapy/ Proton and ion therapy set to transform global cancer treatment

Cancer is a critical societal issue. Worldwide, in 2012 alone, 14.1 million cases were diagnosed, 8.2 million people died and 32.5 million people were living with cancer. These numbers are projected to rise by 2030 to 24.6 million newly diagnosed patients and 13 million deaths per year. While the rate of cancer diagnoses is growing only steadily in the most developed countries, less developed countries can expect a two-fold increase in the next 20 years or so. The growing economic burden imposed by cancer – amounting to around $2 trillion worldwide in 2010 – is putting considerable pressure on public healthcare budgets.

Radiotherapy, in which ionising radiation is used to control or kill malignant cells, is a fundamental component of effective cancer treatment. It is estimated that about half of cancer patients would benefit from radiotherapy for treatment of localised disease, local control, and palliation. The projected rise in cancer cases will place increased demand on already scarce radiotherapy services worldwide, particularly in less developed countries.

In 2013, member states of the World Health Organisation agreed to develop a comprehensive global monitoring framework for non-communicable diseases (NCDs). The aim is to reduce premature mortality from cardiovascular and chronic respiratory diseases, cancers and diabetes by 25%, relative to 2010 levels, which means 1.5 million deaths from cancer will need to be prevented each year.

Advanced cancer therapy techniques based on beams of protons or ions are among several tools that are expected to play a significant role in this effort (see “Therapeutic particles”). In addition, advanced imaging and detection technologies developed for high-energy physics research – many driven by CERN and the physics community – are needed. These include in-beam positron emission tomography (PET) and prompt-gamma imaging, and treatment planning based on the latest Monte Carlo simulation codes.

Optimal dose

The main goal of radiotherapy is to maximise the damage to the tumour while minimising the damage to the surrounding healthy tissue, thereby reducing acute and late side effects. The most frequently used radiotherapy modalities employ high-energy (MeV) photon or electron beams. Conventional X-ray radiation therapy is characterised by almost exponential attenuation and absorption, delivering the maximum energy near the beam entrance but continuing to deposit significant energy beyond the cancer target. For X-ray beams with an energy of about 8 MeV, the maximum energy deposition is reached at a depth of 2–3 cm in soft tissue. To deliver dose optimally to the tumour while protecting surrounding healthy tissue, radiotherapy has progressed rapidly through the development of new technologies and methodologies. The latest developments include MRI-guided radiotherapy, which combines simultaneous MRI imaging and photon irradiation. Such advanced radiation-therapy modalities are becoming increasingly important and offer new opportunities to treat different cancers, in particular in combination with other emerging areas such as cancer immunotherapy and the integration of sequencing data with clinical-decision support systems for personalised medicine.

However, if one looks at the dose-deposition profile of photons compared to other particles (figure 1), the conspicuous feature is that, in the case of protons and carbon ions, a significant fraction of the energy is deposited in a narrow depth range near the end of the trajectory, after which very little energy is deposited. It was precisely this difference in dose deposition – the so-called Bragg-peak effect – that led visionary physicist and founder of Fermilab Robert Wilson to propose the use of hadrons for cancer treatment in 1946.
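The contrast can be summarised in two formulas: beyond its build-up region a photon beam attenuates roughly exponentially with depth, whereas the range of a proton is often parametrised by the empirical Bragg–Kleeman rule. The constants below are typical textbook values for protons in water, quoted only for orientation:

```latex
D_{\gamma}(z) \approx D_{0}\,e^{-\mu z}
\qquad \text{(photons, beyond the build-up depth)}

R(E) \approx \alpha E^{p},
\qquad \alpha \approx 2.2\times10^{-3}\ \mathrm{cm\,MeV}^{-p},\; p \approx 1.77
\qquad \text{(protons in water)}
```

With these numbers a 200 MeV proton stops after roughly 26 cm of water, depositing most of its dose in the final few millimetres – the Bragg peak – whereas a photon beam continues to irradiate tissue all the way through.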

Several advantages

Hadron or particle therapy is a precise form of radiotherapy that uses charged particles instead of X-rays to deliver the radiation dose. Radiation therapy with hadrons (protons and other light ions) offers several advantages over X-rays: not only do these particles deposit most of their energy at the end of their range, but the beams can be shaped with great precision. This allows for more accurate treatment of the tumour, destroying the cancer cells with minimal damage to surrounding tissue. Exploiting the unique physical and radiobiological properties of charged hadrons also allows highly conformal treatment of various kinds of tumours, in particular those that are radio-resistant.

Over the past two decades, particle-beam cancer therapy has gained huge momentum. Many new centres have been built, and many more are under construction (figure 2). At the end of 2016 there were 67 centres in operation worldwide, with another 63 under construction or in the planning stage. Most are proton centres: 25 in the US (protons only); 19 in Europe (three dual centres); 15 in Japan (four carbon and one dual); three in China (one carbon and one dual); and four in other parts of the world. By 2021 there will be 130 centres operating in nearly 30 countries. European centres are shown in figure 3, while figure 4 shows that the cumulative number of treated patients is growing almost exponentially.

At the end of 2007, 61,855 patients had been treated worldwide (53,818 with protons and 4,450 with carbon ions). By the end of 2016 the total had grown to 168,000 (145,000 with protons and 23,000 with carbon ions). This is due primarily to the greater availability of dedicated centres able to meet the growing demand for this particular form of radiotherapy, and the growth rate will most probably increase further as the patient throughput per centre rises.
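As a quick sanity check on “almost exponentially”: the two totals quoted above imply an average growth rate of just under 12% a year. The following lines reproduce the arithmetic; the figures are those in the text, while the constant-rate assumption is ours.

```python
# Average annual growth implied by the quoted patient totals (2007 -> 2016),
# assuming a constant growth rate over the nine-year span.

import math

n_2007, n_2016, years = 61_855, 168_000, 9
rate = (n_2016 / n_2007) ** (1 / years) - 1
doubling_time = math.log(2) / math.log(1 + rate)
print(f"average growth: {rate:.1%} per year")        # -> 11.7% per year
print(f"doubling time : {doubling_time:.1f} years")  # -> ~6.2 years
```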

Particle-physics foundation

High-energy physics research has played a major role in initiating, and now expanding, the use of particle therapy. The first patient was treated with hadrons at Berkeley National Laboratory in the US in September 1954 – the same year CERN was founded – a milestone made possible by Ernest Lawrence’s invention of the cyclotron and his subsequent collaboration with his physician brother, John. The first hospital-based particle-therapy centres opened in 1989 at Clatterbridge in the UK and in 1990 at the Loma Linda University Medical Center in the US. Before then, all research related to hadron therapy and patient treatment was carried out in particle-physics labs.

In addition to the technologies and research facilities coming from the physics community, the culture of collaboration at the heart of organisations such as CERN is finding its way into other fields. This has inspired the European Network for Light Ion Hadron Therapy (ENLIGHT), now running for 15 years, which promotes international discussion and collaboration in the multidisciplinary field of hadron therapy (see “Networking against cancer”).

Were it not for the prohibitively large cost of installing proton-therapy treatment in hospitals, it would be the treatment of choice for most patients with localised tumours. Proton-therapy technology is significantly more compact today than it once was, but when combined with the gantry and other necessary equipment, even the most compact systems on the market occupy an area of a couple of hundred square metres. Most hospitals lack the financial resources and space to construct a special building for proton therapy, so we need to make facilities smaller and cheaper, with costs of around $5–10 million for a single room, similar to state-of-the-art photon-therapy systems. An ageing population, and the need for a more patient-specific approach to cancer treatment and other age-related diseases, present major challenges for future technologies to control rising health costs, while continuing to deliver better outcomes for patients. Scientists working at the frontiers of particle physics have much to contribute to these goals, and the culture of collaboration will ensure that breakthrough technologies find their way into the medical clinics of the future.

Bridging the gap https://cerncourier.com/a/bridging-the-gap/ Mon, 15 Jan 2018 09:15:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/bridging-the-gap/ Working towards medical linacs for challenging environments

If you live in a low- or middle-income country (LMIC), your chances of surviving cancer are significantly lower than if you live in a wealthier economy. That’s largely due to the limited availability of radiation therapy (see “The changing landscape of cancer therapy”). Between 2015 and 2035, the number of cancer diagnoses worldwide is expected to increase by 10 million, with around 65% of those cases in poorer economies. Approximately 12,600 new radiotherapy treatment machines and up to 130,000 trained oncologists, medical physicists and technicians will be needed to treat those patients.

Experts in accelerator design, medical physics and oncology met at CERN on 26–27 October 2017 to address the technical challenge of designing a robust linear accelerator (linac) for use in more challenging environments. Jointly organised by CERN, the International Cancer Expert Corps (ICEC) and the UK Science and Technology Facilities Council (STFC), the workshop was funded through the UK Global Challenges Research Fund, enabling participants from Botswana, Ghana, Jordan, Nigeria and Tanzania to share their local knowledge and perspectives. The event followed a successful inaugural workshop in November 2016, also held at CERN (CERN Courier March 2017 p31).

The goal is to develop a medical linear accelerator that provides state-of-the-art radiation therapy in situations where the power supply is unreliable, the climate harsh and/or communications poor. The immediate objective is to develop work plans involving Official Development Assistance (ODA) countries that link to the following technical areas (which correspond to technical sessions in the October workshop): RF power systems; durable and sustainable power supplies; beam production and control; safety and operability; and computing.

Participants agreed that improving the operation and reliability of selected components of medical linear accelerators is essential to delivering a better linac and associated instrumentation in the next three to seven years. A frequent impediment to the reliable delivery of radiotherapy in LMICs and other underserved regions is the environment within which the sophisticated linear accelerator must function. Excessive ambient temperatures, inadequate cooling of machines and buildings, extensive dust in the dry season and high humidity in some ODA countries are only a few of the environmental factors that can challenge both the robustness of treatment machines and the general infrastructure.

Simplicity of operation is another significant factor in using linear accelerators in clinics. The limiting factors for the development of radiotherapy in lower-resourced nations include not just the cost of equipment and infrastructure, but also a shortage of personnel trained to properly calibrate and maintain the equipment and to deliver high-quality treatment. On one hand, the radiation technologist should be able to set up treatments under the direction of the radiation oncologist and in accordance with the treatment plan. On the other, maintenance of the linear accelerators should be as easy as possible – from remote upgrades to monitoring that anticipates the failure of components. These centres, and their machines, should be able to provide treatment on a 24/7 basis if needed and, at the same time, deliver first-class treatment consistent with that offered in richer countries. STFC will help to transform the ideas and projects presented at the next workshop, scheduled for March 2018, into a comprehensive technology proposal for a novel linear accelerator. This will then be submitted to the Global Challenges Research Fund Foundation Awards 2018 call for further funding. This ambitious project aims to have facilities and staff available to treat patients in low- and middle-income countries within 10 years.

Networking against cancer https://cerncourier.com/a/networking-against-cancer/ Mon, 15 Jan 2018 09:15:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/networking-against-cancer/ ENLIGHT: 15 years of promoting hadron therapy in Europe

The inaugural meeting of the European Network for Light Ion Hadron Therapy (ENLIGHT) took place at CERN in February 2002, with the aim of co-ordinating European efforts in innovative cancer treatment strategies using radiation. Specialists from different disciplines, including radiation biology, oncology, physics and engineering, with experience and interest in particle therapy have nurtured the network ever since.

Today, ENLIGHT can count on the contribution of more than 700 members from all continents. Together, they identify and tackle the technical challenges related to the use of highly sophisticated machines, train young and specialist researchers, and seek funding to ensure the sustainability and effectiveness of the organisation.

Started with the support of the European Commission (EC), ENLIGHT has since coordinated four other EC projects in particle therapy: ULICE, PARTNER, ENVISION and ENTERVISION. In the past 15 years, the network has evolved into an open, collaborative and multidisciplinary platform for establishing priorities and assessing the effectiveness of various treatment modalities. Initially built on the three technology and innovation pillars of high-energy physics – accelerators, detectors and computing – the ENLIGHT initiative has grown into a global effort.

Training essential

ENLIGHT has witnessed a large increase in dedicated particle therapy centres, and innovative medical imaging techniques are starting to make their way into hospitals. Skilled experts for high-tech cancer treatment are, therefore, in high demand. Thanks to the large number of scientists involved and its wide reach, ENLIGHT has enormous potential to offer education and training and, since 2015, has included training sessions in its annual meetings.

Education and training, in addition to pitching for research funding, are the main thrusts of ENLIGHT’s activities today. A project within the CERN & Society Foundation has just been approved, opening a new chapter for ENLIGHT and its community. The benefits lie not only in reinforcing the hadron-therapy field with qualified multidisciplinary groups of experts, but especially in helping young scientists flourish in the future.

www.cern.ch/ENLIGHT.

Strategic step for medical impact https://cerncourier.com/a/viewpoint-strategic-step-for-medical-impact/ Mon, 15 Jan 2018 09:15:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/viewpoint-strategic-step-for-medical-impact/ Knowledge transfer has become an established part of CERN’s programme, says Frédérick Bordry.

Innovative ideas and technologies from physics have contributed to great advances in medicine, in particular radiation-based medical diagnosis and treatment. Today, state-of-the-art techniques derived from particle-physics research are routinely used in clinical practice and medical research centres: from technology for PET scanners and dedicated accelerators for cancer therapy (see “The changing landscape of cancer therapy”), to simulation and data-analysis tools.

Transferring CERN’s know-how to other fields is an integral part of its mission. Over the past 60 years, CERN has developed widely recognised expertise and unique competencies in particle accelerators, detectors and computing. While CERN’s core mission is basic research in particle physics, these “tools of the trade” have found applications in a variety of fields and can have an impact far beyond their initial expectations. An excellent recent example is the completion of CERN MEDICIS, which uses a proton beam to produce radioisotopes for medical research (see “Isotopes for precision medicine”).

Knowledge transfer (KT) for the benefit of medical applications has become an established part of CERN’s programme, formalised within the KT group. CERN has further initiated numerous international and multidisciplinary collaborations, partially or entirely devoted to technologies with applications in the medical field, some of which have been funded by the European Commission (EC). Until recently, the transfer of knowledge and technology from physics to medicine at CERN has essentially been driven by enthusiastic individuals on an ad hoc basis. In light of significant growth in medical applications-related activities, in 2017 CERN published a formal medical applications strategy (approved by the Council in June).

Its aims are to ensure that medical applications-related knowledge-transfer activities are carried out without affecting CERN’s core mission of fundamental research, are relevant to the medical community, and are delivered within a sustainable funding model.

The focus is on R&D projects using technologies and infrastructures that are uniquely available at CERN, seeking to minimise any duplication of efforts taking place in Member States and associate Member States. The most promising CERN technologies and infrastructure that are relevant to the medical domain shall be identified – and the results matched with the requirements of the medical research communities, in particular in CERN’s Member States and associate Member States. Projects shall then be identified, taking into account such things as: maximising the impact of CERN’s engagement; complementarities with work at other laboratories; and the existence of sufficient external funding and resources.

CERN’s medical applications-related activities are co-ordinated by the CERN KT medical applications section, which also negotiates the necessary agreements with project partners. A new KT thematic forum, meanwhile, brings together CERN and Member State representatives to exchange information and ideas about medical applications (see “Faces and Places”). The CERN Medical Applications Steering Committee (CMASC) selects, prioritises, approves and coordinates all proposed medical applications-related projects. The committee receives input from the Medical Applications Project Forum (MAPF), the CERN Medical Applications Advisory Committee (CMAAC) and various KT bodies.

Although CERN can provide a limited amount of seed funding for medical applications projects, external stakeholders must provide the funding needed to deliver their project. Additional funding may be obtained through the EC Framework Programmes, and the CERN & Society Foundation is another potential source.

The transfer of know-how and technologies from CERN to the medical community is one of the natural vehicles for CERN to disseminate the results of its work to society as widely as possible. The publication of a formal strategy document represents an important evolution of CERN’s programme and highlights its commitment to maximising the societal impact of its research and to transferring CERN’s know-how and technology to its Member States and associate Member States.

From the web to a start-up near you https://cerncourier.com/a/from-the-web-to-a-start-up-near-you/ Fri, 22 Sep 2017 07:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/from-the-web-to-a-start-up-near-you/ The core mission of CERN is fundamental research in particle physics. Yet, as a publicly funded laboratory, it also has a remit to ensure that its technology and expertise deliver prompt and tangible benefits to society wherever possible. Other physics-research laboratories and institutes were early adopters of CERN technologies, thanks to the highly collaborative nature […]

The core mission of CERN is fundamental research in particle physics. Yet, as a publicly funded laboratory, it also has a remit to ensure that its technology and expertise deliver prompt and tangible benefits to society wherever possible. Other physics-research laboratories and institutes were early adopters of CERN technologies, thanks to the highly collaborative nature of particle physics. Since its creation in 1954, CERN has also been active in transferring technology to industry, mainly through purchasing contracts or collaboration agreements. Through novel developments in the field of accelerator technologies and detectors, and more recently in computing and digital sciences, CERN technologies and know-how have contributed to applications in many fields, including the World Wide Web, invented at CERN by Tim Berners-Lee in 1989.

As its impact has broadened, in 1997 CERN set up a reinforced policy and team to support its knowledge- and technology-transfer activities. Twenty years later, these activities are still going strong. Some 18 start-up companies around the world are currently using CERN technology and CERN has developed a network of Business Incubation Centres (BICs) in nine different Member States. Its knowledge-transfer activities have impacted a wide range of fields, from medical and biomedical technologies to aerospace applications, safety, “industry 4.0” and the environment.

Maximising the societal impact of CERN technologies is a key aim for CERN’s knowledge-transfer activities. To do this effectively, CERN has set up a thematic forum with delegates from all of its Member States and associate Member States. Regular meetings are held at CERN, and beginning this year there will also be forum meetings dedicated to medical applications – which is one of the most prominent examples of CERN’s impact so far.

Technology for health

Early activities at CERN relating to medical applications date back to the 1970s, and have been triggered for the most part by individual initiatives. The multiwire proportional chamber conceived in 1968 by CERN physicist Georges Charpak not only opened a new era for particle physics and earned its inventor the 1992 Nobel Prize in Physics, but also found important X-ray and gamma-ray imaging applications in biology, radiology and nuclear medicine. Essential early work at CERN also contributed significantly to the development of advanced detectors and analysis techniques for positron emission tomography (PET). In particular, starting in 1975 with famous images of a mouse, CERN physicist David Townsend led important contributions to the reconstruction of PET images and to the development of 3D PET, in collaboration with the University of Geneva and the Geneva Cantonal Hospital.

After these individual efforts, in the 1990s CERN witnessed the first collaborative endeavours in medical applications. The Crystal Clear and Medipix collaborations started to explore the feasibility of developing technologies used in the LHC detectors – scintillating crystals and hybrid silicon pixel detectors, respectively – for possible medical applications, such as PET and X-ray imaging. At the same time, the Proton Ion Medical Machine Study (PIMMS) was initiated at CERN, with the aim of producing a synchrotron design optimised for treating cancer patients with protons and carbon ions. The initial design was improved by the TERA Foundation, and finally evolved into the machine built for the CNAO treatment centre in Italy, with seminal contributions from INFN. Later on, MedAustron in Austria built its treatment centre starting from the CNAO design. Beyond the initial design study, CERN contributed to the realisation of the CNAO and MedAustron treatment centres, in particular with expertise in accelerators and magnets and with training of personnel.

For the past 50 years, CERN has hosted the ISOLDE facility, dedicated to the production of a large variety of radioactive ion beams for experiments in the fields of nuclear and atomic physics, solid-state physics, materials science and life sciences. Over 1200 radioisotopes from more than 70 chemical elements have been made available for fundamental and applied research, including in the medical field. A particular highlight was the demonstration in 2012 of the efficiency of terbium-149, one of the lightest alpha emitters, for treatment at the level of single cancer cells. The growing worldwide interest in novel isotopes suitable for theragnostics – the possibility of performing both imaging and treatment at the same time – has motivated an extension of ISOLDE called CERN-MEDICIS (Medical Isotopes Collected from ISOLDE). This new facility will produce, as of this autumn, innovative isotopes for medical research at collaborating institutes (CERN Courier October 2016 p28).

Today, activities pertinent to medical applications are happening in all areas of CERN, with some compelling examples highlighted in the panel opposite. In June 2017, CERN Council approved a document setting out the “Strategy and framework applicable to knowledge transfer by CERN for the benefit of medical applications”.

Aiming high

Aerospace and particle physics might not at first seem obvious partners. However, both fields have to deal with radiation and other extreme environments, posing stringent technological requirements that are often similar. CERN operates testing facilities and develops qualification technologies for high-energy physics, which are useful for ground testing and qualification of flight equipment. This opportunity is particularly attractive for miniaturised satellites called CubeSats that typically use commercial off-the-shelf components for their electronics, since radiation qualification according to standard procedures is expensive and time-consuming. The CERN Latchup Experiment STudent sAtellite (CELESTA) intends to develop a CubeSat version of RadMon, a radiation monitor developed at CERN, and to prove that low-Earth orbit qualification can be performed in CERN’s High energy AcceleRator Mixed field facility (CHARM). CELESTA is being developed in collaboration with the University of Montpellier and this year was selected by ESA’s “Fly Your Satellite!” programme to be deployed in orbit in 2018 or 2019.

Magnesium diboride (MgB2), the high-temperature superconductor that will be used for the innovative electrical transmission lines of the high-luminosity LHC, has also demonstrated its potential for future space missions. Within the framework of the European Space Radiation Superconducting Shield (SR2S) project, which aims to demonstrate the feasibility of using superconducting magnetic shielding technology to protect astronauts from cosmic radiation, CERN successfully tested a prototype racetrack coil wound with MgB2 superconducting tape. Astronauts’ exposure to space radiation is a major concern for future crewed missions to Mars and beyond. Monte Carlo codes such as FLUKA, initially developed jointly by CERN and INFN, and Geant4, developed and maintained by a worldwide collaboration with strong support from CERN since its conception, have been routinely used to study the radiation environment of past, recent and future space missions. Timepix detectors – USB-powered particle trackers based on the Medipix technology – are already used by NASA on board the International Space Station to monitor radiation doses accurately.

CERN’s computing expertise is also finding applications in aerospace. To solve the challenge of sharing software and code in big-data environments, researchers at CERN have developed the CernVM File System (CernVM-FS), which is currently used in high-energy physics experiments to distribute about 350 million files. The system is now also being used by Euclid, a European space mission that aims to study the nature of dark matter and dark energy, to deploy software to Euclid’s nine science data centres.
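Part of what makes CernVM-FS attractive for such deployments is that, on the client side, a repository is just a read-only directory tree mounted under /cvmfs, with metadata and file contents fetched over HTTP and cached locally on first access. A minimal sketch of what consuming a repository looks like follows; the repository name is illustrative, not Euclid’s actual one.

```python
# A CernVM-FS repository appears as an ordinary read-only directory under
# /cvmfs; metadata and file contents are downloaded and cached on demand.
# The repository name below is illustrative.

import os

repo = "/cvmfs/example.cern.ch"

if os.path.isdir(repo):
    # Listing a directory fetches only catalogue metadata; file contents
    # are transferred the first time a file is actually read.
    for entry in sorted(os.listdir(repo))[:10]:
        print(entry)
else:
    print(f"{repo} is not mounted - is the CernVM-FS client configured?")
```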

CERN technologies and know-how have found concrete applications in a variety of other fields. One of them is safety: CERN’s unique working environment, which combines various types of radiation, extremely low temperatures, ultra-high magnetic fields and very high voltages, requires innovative solutions for detecting threats and preventing risks. An example is B-rad, a portable radiation-survey meter designed to work in strong magnetic fields, initially developed by CERN’s radiation-protection group and fire brigade. With a financial contribution from the CERN Knowledge Transfer (KT) Fund, the product has been brought from lab prototype to finished product in collaboration with an Italian company. Another example is Kryolize, a novel cryogenic-safety software package also supported by the CERN KT Fund. Six Kryolize licences have now been granted to other research laboratories, with potential application domains ranging from the food industry to cryogenic techniques in medicine.

CERN also taps into its technologies and creativity to address the challenge of a healthier and more sustainable planet. Its contributions in this area range from novel biochemical sensors for water safety to novel irrigation techniques for the most challenging agricultural environments. The innovative non-evaporable getter (NEG) technology developed to reach ultra-high-vacuum conditions in the LHC vacuum chambers, for example, has been successfully used in other applications, including thermal solar panels.

MgB2-based superconducting power cables could also offer significant power-transmission solutions for densely populated, high-load areas, and CERN is part of a consortium to build a prototype to demonstrate the feasibility of this concept.

Another buzz-worthy trend in industry is so-called “industry 4.0”, a push towards increasing automation and efficiency in manufacturing processes with connected sensors and machines, autonomous robots and big-data technology. CERN’s accelerators, detectors and computing facilities naturally call for the latest industry-4.0 technology, while the technological solutions to CERN’s own challenges can be used in the automation industry. In the field of robotics, CERN has developed TIM (Train Inspection Monorail), a mini vehicle that autonomously monitors the 27 km-long LHC tunnel, moving along tracks suspended from the tunnel’s ceiling, and can be programmed to perform real-time inspection missions. This innovation has already caught the eye of industry, in particular for the autonomous monitoring of utilities infrastructure such as underground water pipelines. Sensor technologies developed at CERN are also being used in drones: the start-up Terabee uses them for aerial inspection and imaging services, and since expanding its business to include CERN sensor development it has won first place in the automation category of Startup World at Automatica.

Boosting KT in practice

One of the main challenges in the knowledge-transfer sphere is to make it as easy as possible for scientists and other specialists to turn their research into innovations, and CERN invests much effort in such activities. Launched in 2011, the CERN KT Fund bridges the gap between research and industry by awarding grants to projects proposed by CERN personnel with high potential for positive impact on society. Since its creation, 40 projects have been funded, each receiving a grant of between CHF 15,000 and CHF 240,000 over a period of one or several years. Among them were projects addressing thermal management in space applications, very large-scale software distribution, distributed optical-fibre sensors and long-term data preservation for digital libraries. In 2016, two European Commission-funded projects, AIDA-2020 and ARIES, incorporated a proof-of-concept fund modelled on CERN’s KT Fund.

Since the early days of technology transfer at CERN, one of the main focuses has been on knowledge transfer through people, especially early-career scientists who move to industry after their contracts at CERN or who start their own companies. Over the last 20 years, CERN has continued to build a general culture of entrepreneurship within the Organization through many different avenues. To assist entrepreneurs and small technology businesses in taking CERN technologies and expertise to the market, CERN has established a network of nine BICs throughout its Member States, where companies can directly express their interest in adopting a CERN technology. The BIC managers provide office space, expertise, business support, access to local and national networks and support in accessing funding. There are currently 18 start-ups and spin-offs using CERN technologies in their business, with four joining BICs last year alone: Ross Robotics (exploiting software developed for production tasks at CERN); Innocryst (developing a system to identify and track gemstones); Colnec Health (using CERN’s know-how in Grid middleware technology) and Camstech (novel electrochemical sensor technologies).

Every year since 2008, students from the School of Entrepreneurship (NSE) at the Norwegian University of Science and Technology (NTNU) have spent a week at CERN evaluating the commercial potential of CERN technologies. Three of the students attending the CERN–NTNU screening week in 2012 started the spin-off TIND, which is based on the open-source software Invenio. TIND now has contracts to host Invenio for, among others, the UNESCO International Bureau of Education, the California Institute of Technology and the Max Planck Institute for Extraterrestrial Physics.

Getting the next generation of scientists into the habit of thinking about their research in terms of impact is vital for knowledge transfer to thrive. In 2015, CERN launched a series of Entrepreneurship Meet-Ups (EM-Us) to foster entrepreneurship within the CERN community. Selected CERN and external entrepreneurship experts present their expertise at informal get-togethers and the events offer a good opportunity to network. In October this year, the EM-Us are celebrating their 50th event, with over 1000 attendees since the series was created, and a new informal “KT-clinic” service has been launched.

Many more interesting projects are in the pipeline. CERN’s knowledge in superconducting technologies can be used in MRI and gantries for hadron therapy, while its skills in handling large amounts of data can benefit the health sector more widely. Detector technologies developed at CERN can be used in non-destructive testing techniques, while compact accelerators benefit the analysis of artworks. These are just some of the examples of new projects we are working on, and more initiatives will be started to meet the needs of industrial and research partners in CERN’s Member States and associate Member States for the next 20 years and beyond.

Cutting-edge medical technologies under CERN’s microscope

Novel designs for compact medical accelerators
Thanks to cutting-edge studies on beam dynamics and radio-frequency technology, along with innovative construction techniques, teams at CERN have manufactured an innovative linear accelerator designed to be compact, modular, low-cost and suitable for medical applications. The accelerator is a radio-frequency quadrupole (RFQ) operating at a frequency of 750 MHz – never achieved before – and capable of producing low-intensity beams of just a few microamps with no significant losses. The high-frequency RFQ capitalises on the skills and know-how developed at CERN while designing Linac4, and is a perfect injector for the new generation of high-frequency compact linear accelerators being developed for hadron therapy.

Expertise in high-gradient accelerating structures gathered by the Compact Linear Collider (CLIC) group at CERN is also being applied to novel designs for hadron-therapy facilities, such as the cyclinac concept proposed by the TERA Foundation, as well as to the development of accelerators that boost the energy of medical cyclotrons to provide proton-imaging capabilities.
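The link between RF frequency and compactness is direct, since the dimensions of the accelerating cells scale with the RF wavelength. As a first-order orientation (the 352 MHz comparison figure is Linac4’s RFQ frequency, and cell-size scaling with wavelength is a rule of thumb, not a full design argument):

```latex
\lambda = \frac{c}{f}:\qquad
\lambda_{750\,\mathrm{MHz}} = \frac{3.0\times10^{8}\ \mathrm{m\,s^{-1}}}{7.5\times10^{8}\ \mathrm{Hz}} = 0.40\ \mathrm{m},
\qquad
\lambda_{352\,\mathrm{MHz}} \approx 0.85\ \mathrm{m}
```

Doubling the frequency thus roughly halves the cell dimensions, which is what makes a 750 MHz RFQ so much more compact than earlier proton RFQs.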

CERN’s know-how in cryogenic systems is also interesting for modern superconducting medical accelerators, such as the compact cyclotron being developed by CIEMAT for on-site production in hospitals of isotopes for PET.

Detectors and medical imaging
Medipix3 is a CMOS pixel-detector read-out chip designed to be connected to a segmented semiconductor sensor. Like its predecessor, Medipix2, it acts as a camera, taking images based on the number of particles that hit the pixels while the electronic shutter is open. However, Medipix3 goes much further than Medipix2 by permitting colour imaging and dead-time-free operation. Ten years ago, a member of the Medipix3 collaboration founded a company in New Zealand and obtained a licence to exploit the chip for spectral computed-tomography imaging – X-ray imaging in colour. The company’s pre-clinical scanners enable researchers and clinicians to study biochemical and physiological processes in specimens and small animals. In a related development, the Timepix3 chip permits trigger-free particle tracking in a single semiconductor layer. Preliminary measurements using the previous generation of the chip point strongly to its potential for beam and dose monitoring in the hadron-therapy environment.
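The principle behind “imaging in colour” is photon counting per energy window: each pixel compares every hit against programmable thresholds and increments a counter, rather than integrating the total deposited charge. The toy sketch below illustrates the counting logic only; the energy windows and the flat spectrum are invented and say nothing about the real chip’s electronics.

```python
# Toy illustration of photon-counting, multi-threshold ("colour") X-ray
# imaging: one pixel sorts incoming photons into energy windows instead
# of integrating charge. Windows and spectrum are invented.

import random

windows_kev = {"low": (10, 30), "mid": (30, 60), "high": (60, 120)}
counts = {name: 0 for name in windows_kev}

random.seed(42)
for _ in range(10_000):              # photons hitting one pixel
    e = random.uniform(5.0, 120.0)   # flat toy spectrum, keV
    for name, (lo, hi) in windows_kev.items():
        if lo <= e < hi:
            counts[name] += 1

print(counts)  # per-window counts -> one "colour" channel per window
```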

Since 1997, CERN’s Crystal Clear collaboration (CCC) has been using its expertise in scintillators to develop and construct PET prototypes. Its first success was the ClearPET concept: the development of several prototypes resulted in the commercialisation of a small-animal scanner with breakthrough performance and led to the first simultaneous PET/CT image of a mouse in 2015. Starting in 2002, the CCC developed dedicated PET scanners for breast imaging, called ClearPEM, with two prototypes undergoing clinical trials. Recent CCC developments focus on time-of-flight PET scanners for better image quality. Via the European FP7 project EndoTOFPET-US, CCC members are developing a novel bi-modal time-of-flight PET and ultrasound endoscope prototype dedicated to the early-stage detection of pancreatic and prostate cancer.

Computing and simulations
Simulation codes initially developed for HEP, such as Geant4 and FLUKA, have also become crucial to modelling the effects of radiation on biological tissues for a variety of applications in the medical field. FLUKA is licensed to various companies in the medical field: in particular, FLUKA-based physics databases are at the core of the commercial treatment-planning systems (TPS) clinically used at HIT and CNAO, as well as of the TPS for carbon ions for MedAustron. Geant4 is adopted by thousands of users worldwide for applications in a variety of domains: examples of Geant4 extensions for use in the medical field are GATE, TOPAS and Geant4-DNA.

Computing tools, infrastructures and services developed for HEP have also great potential for applications in the medical field. CERN openlab has recently started two collaborative projects in this domain: BioDynaMo aims to design and build a cloud-based computing platform for rapid simulation of biological tissue dynamics, such as brain development; GeneROOT aims to use ROOT to analyse large genomics data sets, beginning with data from TwinsUK, the largest UK adult twins registry.
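To give a flavour of what analysing genomics data with ROOT might look like, here is a minimal PyROOT sketch that histograms one per-region quantity from a TTree. The file, tree and branch names are invented, since GeneROOT’s actual data model is not described here.

```python
# Minimal PyROOT sketch: histogram one quantity stored in a ROOT TTree.
# File, tree and branch names are invented for illustration; GeneROOT's
# real data layout is not documented in this article.

import ROOT

f = ROOT.TFile.Open("genome_sample.root")   # hypothetical input file
tree = f.Get("regions")                     # hypothetical TTree

h = ROOT.TH1F("h_cov", "Read coverage per region;coverage;regions",
              100, 0.0, 500.0)
for entry in tree:          # PyROOT iterates TTree entries directly
    h.Fill(entry.coverage)  # 'coverage' is an assumed branch name

c = ROOT.TCanvas("c", "coverage", 800, 600)
h.Draw()
c.SaveAs("coverage.png")
```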

 

A brief history of knowledge transfer at CERN

• 1954: Since its creation, CERN has been active in knowledge and technology transfer, although with no formal structure.

• 1974 & 1983–1984: Two external studies consider the economic impact of CERN contracts and find that it equates to around 3–3.5 times the contract value.

• 1987: CERN’s Annual Report incorporates the first dedicated section on technology-transfer activities.

• 1988: The Industry and Technology Liaison Office (ITLO) is founded at CERN to stimulate interaction with industry, including through procurement.

• June 1997: With the support of Council, CERN sets up a reinforced structure for technology transfer.

• November 1997: The “Basic Science and Technology Transfer: Means and Methods in the CERN Environment” workshop helps to identify appropriate technology-transfer mechanisms.

• 1998: CERN develops a series of reports on intellectual-property-rights protection practices that was endorsed by the finance committee.

• 1999: First technology-transfer policy at CERN, with the new technology-transfer service replacing the ITLO and three main actions: to encourage the protection of intellectual-property rights for new technologies developed at CERN and the institutes participating in its scientific programme; to promote the training of young scientists in intellectual-property rights; and to promote entrepreneurship.

• 2001: CERN begins to report on its technology-transfer activities annually to the CERN finance committee.

• 2008: Creation of HEPTech, a technology-transfer network for high-energy physics.

• 2010: CERN develops a new policy on the management of intellectual property in technology-transfer activities at CERN.

• 2014: OECD publishes a report entitled “The Impacts of Large Research Infrastructures on Economic Innovation and on Society: Case studies at CERN”, which praises innovation at CERN.

• 2016: CERN is featured as a leader in the World Intellectual Property Organization (WIPO) Global Innovation Index.

• 2017: CERN publishes new medical-applications strategy, works on a set of updated software and spin-off and patent policies, and launches a revamped knowledge-transfer website: kt.cern.

 

CERN’s recipe for knowledge transfer https://cerncourier.com/a/viewpoint-cerns-recipe-for-knowledge-transfer/ Fri, 22 Sep 2017 07:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/viewpoint-cerns-recipe-for-knowledge-transfer/ We want to build a culture of entrepreneurship whereby more people leaving CERN consider starting a business, says Knowledge Transfer head Giovanni Anelli.

Understanding what the universe is made of and how it started are the fundamental questions behind CERN’s existence. This quest alone makes CERN a unique knowledge-focused organisation and an incredible human feat. To achieve its core mission, CERN naturally creates new opportunities for innovation. A myriad of engineers, technicians and scientists develop novel technology and know-how that can be transferred to industry for the benefit of society. Twenty years ago, with the support of CERN Council, a reinforced structure for knowledge and technology transfer was established to strengthen these activities.

Advances in fields including accelerators, detectors and computing have had a positive impact outside CERN. Although fundamental physics might not seem the most obvious discipline in which to find technologies with marketable applications, the many applications of CERN’s technology and know-how – whether in medical technology, aerospace, safety, the environment or “industry 4.0” – are concrete evidence that high-energy physics is fertile ground for innovation. That CERN’s expertise finds applications in multinational companies, small and medium-sized enterprises and start-ups alike is further proof of its broader impact (see “From the web to a start-up near you”).

As an international organisation, CERN has access to a wealth of diverse viewpoints, skills and expertise. But what makes CERN different from other organisations in other fields of research? Sociologists have long studied the structure of scientific organisations, several using CERN as a basis, and they find that particle-physics collaborations uniquely engage in “participatory collaboration” that brings added value in knowledge generation, technology development and innovation. This type of collaboration, along with the global nature of the experiments hosted by CERN, adds high value to the laboratory’s knowledge-transfer activities.

Despite its achievements in knowledge transfer internationally, CERN is seldom listed in international innovation rankings; when it is present, it is never at the top. This is mainly a selection effect due to methodology. For example, the Reuters “Top 25 Global Innovators – Government” ranking relies on patents as a proxy for innovation (of the 10 innovation criteria used, seven are based on patents). CERN’s strategy is to focus on open innovation and to maximise the dissemination of our technologies and know-how, rather than focus on revenue. Although there is a wide range of intellectual-property tools useful for knowledge transfer, patent volume is not a relevant measure of successful intellectual-property management at CERN.

Instead, the CERN Knowledge Transfer group measures the number of new technology disclosures (91 in 2016) and the number of contracts and agreements signed with external partners and industry (42 in 2016, and 251 in total since 2011). We also monitor spin-off and start-up companies – there are currently 18 using CERN technology, some of them hosted directly in CERN’s network of business incubation centres. Together with the impressive breadth of application fields of CERN technologies, we believe these are clearer measures of impact.

In the future, CERN will continue to pursue and promote open innovation. We want to build a culture of entrepreneurship whereby more people leaving CERN consider starting a business based on CERN technologies, and use a wide range of metrics to quantify our innovation. Strong links with industry are important to help reinforce a market-pull rather than technology-push approach. The Knowledge Transfer group will also continue to provide a service to the CERN community through advice, support, training, networks and infrastructure for those who wish to engage with industry through our activities.   

Human capital is vital in our equation, since knowledge transfer cannot happen without CERN’s engineers, technicians and physicists. Our role is to facilitate their participation, which could start with a visit to our new website, an Entrepreneurship Meet-Up (EM-U), or a visit to one of our seminars. Since they were launched roughly two years ago, EM-Us and knowledge-transfer seminars have together attracted more than 2000 people. Whether you want to tell us about an idea you have, or are curious about the impact of our technologies on society, we hope to hear from you soon.

• Find out more at kt.cern.

Get on board with EASITrain https://cerncourier.com/a/get-on-board-with-easitrain/ Fri, 11 Aug 2017 07:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/get-on-board-with-easitrain/ Heike Kamerlingh Onnes won his Nobel prize back in 1913 two years after the discovery of superconductivity; Georg Bednorz and Alexander Müller won theirs in 1987, just a year after discovering high-temperature superconductors. Putting these major discoveries into use, however, has been a lengthy affair, and it is only in the past 30 years or […]

Heike Kamerlingh Onnes won his Nobel prize back in 1913 two years after the discovery of superconductivity; Georg Bednorz and Alexander Müller won theirs in 1987, just a year after discovering high-temperature superconductors. Putting these major discoveries into use, however, has been a lengthy affair, and it is only in the past 30 years or so that demand has emerged. Today, superconductors represent an annual market of around $1.5 billion, with a high growth rate, yet a plethora of opportunities remains untapped.

Developing new superconducting materials is essential for a possible successor to the LHC currently being explored by the Future Circular Collider (FCC) study, which is driving a considerable effort to improve the performance and feasibility of large-scale magnet production. Beyond fundamental research, superconducting materials are the natural choice for any application where strong magnetic fields are needed. They are used in applications as diverse as magnetic resonance imaging (MRI), the magnetic separation of minerals in the mining industry and efficient power transmission across long distances (currently being explored by the LIPA project in the US and AmpaCity in Germany).

The promise for future technologies is even greater, and overcoming our limited understanding of the fundamental principles of superconductivity and enabling large-quantity production of high-quality conductors at affordable prices will open new business opportunities. To help bring this future closer, CERN has initiated the European Advanced Superconductivity Innovation and Training project (EASITrain) to prepare the next generation of researchers, develop innovative materials and improve large-scale cryogenics (easitrain.web.cern.ch). From January next year, 15 early-stage researchers will work on the project for three years, with the CERN-coordinated FCC study providing the necessary research infrastructure.

Global network

EASITrain establishes a global network of research institutes and industrial partners, transferring the latest knowledge while also equipping participants with business skills. The network will join forces with other EU projects such as ARIES, EUROTAPES (superconductors), INNWIND (a 10–20 MW wind turbine), EcoSWING (superconducting wind generator), S-PULSE (superconducting electronics) and FuSuMaTech (a working group approved in June devoted to the high-impact potential of R&D for the HL-LHC and FCC), and aims to profit from the well-established Test Infrastructure and Accelerator Research Area Preparatory Phase (TIARA) platform. EASITrain also links with the Marie Curie training networks STREAM and RADSAGA, both hosted by CERN.

EASITrain operates within the EU’s H2020 framework, and one of its targets is energy sustainability. Performance and efficiency increases in the production and operation of superconductors could lead to 10–20 MW wind turbines, for example, while new efficient cryogenics could reduce the carbon footprint of industries, gas production and transport. EASITrain will also explore the use of novel superconductors, including high-temperature superconductors, in advanced materials for power-grid and medical applications, and bring together technical experts, industrial representatives and specialists in business and marketing to identify new superconductor applications. Following an extensive study, three specific application areas have been identified: uninterruptible power supplies; sorting machines for the fruit industry; and large loudspeaker systems. These will be further explored during a three-day “superconductivity hackathon” satellite event at EUCAS17, organised jointly with CERN’s KT group, IdeaSquare, WU Vienna and the Fraunhofer Institute.

Together with the impact that superconductors have had on fundamental research, these examples show the unexpected transformative potential of these still mysterious materials and emphasise the importance of preparing the next generation for the challenges ahead.

Hackathon application destinations

Uninterruptible power supply (UPS). UPS systems are energy-storage technologies that can take on and deliver power when necessary. Cloud-based applications are leading to soaring data volumes and an increasing need for secure storage, driving growth among large data centres and a shift towards more efficient UPS solutions that are expected to capture a slice of a growing market worth almost $1 billion. Current versions are based on batteries with a maximum efficiency of 90%, but superconductor-based flywheel systems would ensure a continuous and longer-lived power supply, minimising data loss and maximising server stability.
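
To make the flywheel argument concrete, the back-of-envelope sketch below estimates the energy stored in a spinning rotor; the mass, radius, speed and load are purely illustrative assumptions, not figures from the hackathon study.

    import math

    # Kinetic energy of a spinning uniform disc: E = 1/2 * I * w^2,
    # with moment of inertia I = 1/2 * m * r^2. All numbers are
    # illustrative assumptions, not EASITrain or market figures.
    m = 250.0                          # rotor mass (kg)
    r = 0.2                            # rotor radius (m)
    rpm = 20000.0                      # rotation speed (rim speed ~420 m/s)

    w = rpm * 2.0 * math.pi / 60.0     # angular velocity (rad/s)
    E = 0.5 * (0.5 * m * r**2) * w**2  # stored energy (J)
    print(f"stored energy: {E/1e6:.1f} MJ")            # ~11 MJ
    print(f"ride-through at 100 kW: {E/100e3:.0f} s")  # ~110 s to bridge an outage

The appeal of superconductors here is that magnetic bearings let such a rotor idle with very low losses, which is where battery-based systems lose their efficiency edge.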

Sorting machines for the fruit industry. Tonnes of fruit have to be disposed of worldwide because current technologies based on spectroscopy cannot determine the maturity level of fruit sufficiently accurately, and they offer only limited information about small fruit. Superconductors would enable NMR-based scanning systems that allow producers to accurately and non-destructively determine valuable properties such as ripeness, absence of seeds and, crucially, the maturity of fruit. In 2016, sorting-machine manufacturers made profits of $360 million selling products analysing apples, pears and citrus fruit, and the market has experienced a growth of about 20% per year.

Large loudspeaker systems. The sound quality of powerful loudspeakers, particularly PA systems for music festivals and stadiums, could enter new dimensions by using superconductors. Higher electrical resistance leads to poorer sound quality, since speakers need to modify the strength of a magnetic field rapidly to adapt to different frequency ranges. Superconductivity also allows smaller magnets to be used, making them more compact and transportable. A major concern among European manufacturers has been the search for the next big step in loudspeaker evolution, to defend against competition from Asia, and the size and quality of large speakers is now a major driver of the $500 million industry.

The post Get on board with EASITrain appeared first on CERN Courier.

Feature https://cerncourier.com/wp-content/uploads/2018/06/CCeas1_07_17-1.jpg
Europe enters the extreme X-ray era https://cerncourier.com/a/europe-enters-the-extreme-x-ray-era/ Mon, 10 Jul 2017 07:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/europe-enters-the-extreme-x-ray-era/ The European X-ray Free-Electron Laser will probe electronic, chemical and biological processes in unprecedented detail.

The past few decades have witnessed an explosion in X-ray sources and techniques, impacting science and technology significantly. Large synchrotron X-ray facilities around the world based on advanced storage rings and X-ray optics are used daily by thousands of scientists across numerous disciplines. From the shelf life of washing detergents to the efficiency of fuel-injection systems, and from the latest pharmaceuticals to the chemical composition of archaeological remains, highly focused and brilliant beams of X-rays allow researchers to characterise materials over an enormous range of length and timescales, and therefore link the microscopic behaviour of a system with its bulk properties.

So-called third-generation light sources based on synchrotrons produce stable beams of X-rays over a wide range of photon energies and beam parameters. The availability of more intense, shorter and more coherent X-ray pulses opens even further scientific opportunities, such as making high-resolution movies of chemical reactions or providing industry with real-time nanoscale imaging of working devices. This boils down to maximising a parameter called peak brilliance. While accelerator physicists have made enormous strides in increasing the peak brilliance of synchrotrons, this quantity experienced a leap forward by many orders of magnitude when the first free-electron lasers (FELs) started operating in the X-ray range more than a decade ago.

FLASH, the soft-X-ray FEL at DESY in Hamburg, was inaugurated in 2005 and marked the beginning of this new epoch in X-ray science. Based on superconducting accelerating structures developed initially for a linear collider for particle physics (see “The world’s longest superconducting linac”), it provided flashes of VUV radiation with peak brilliances almost 10 orders of magnitude higher than any storage-ring-based source in the same wavelength range. The unprecedented peak power of the beam immediately led to groundbreaking new research in physics, chemistry and biology. But importantly, FLASH also demonstrated that the amplification scheme responsible for the huge gain of FELs – Self Amplified Spontaneous Emission (SASE) – was feasible at short wavelengths and could likely be extended to the hard-X-ray regime.

The first hard-X-ray FEL to enter operation based on the SASE principle was the Linac Coherent Light Source (LCLS) at SLAC National Accelerator Laboratory in California, which obtained first light in 2009 using a modified version of the old SLAC linac and operates at X-ray energies up to around 11 keV. Since then, several facilities have been inaugurated or are close to start-up: SACLA in Japan, Pohang FEL in South Korea, and Swiss-FEL in Switzerland. The European X-ray Free-Electron Laser (European XFEL) in Schenefeld-Hamburg, Germany, marks a further step-change in X-ray science, promising to produce the brightest beams with the highest photon energies and the highest repetition rates. Construction of the €1.2 billion facility began in January 2009 funded by 11 countries: Denmark, France, Germany, Hungary, Italy, Poland, Russia, Slovakia, Spain, Sweden and Switzerland, with Germany (58%) and Russia (27%) as the largest contributors. It is expected that the UK will join the European XFEL in 2017.

The European XFEL extends over a distance of 3.4 km in underground tunnels (figure 1). It begins with the electron injector at DESY in Bahrenfeld-Hamburg, which produces and injects electrons into a 2 km-long superconducting linear accelerator where the desired electron energy (up to 17.5 GeV) is achieved. Exiting the linac, electrons are then rapidly deflected in an undulating left–right pattern by traversing a periodic array of magnets called an undulator (figure 1, bottom right), causing the electrons to emit intense beams of X-ray photons. X-rays emerging from the undulator, via 1 km-long photon-transport tunnels equipped with various X-ray optics elements, finally arrive at the European XFEL headquarters in Schenefeld where the experiments will take place.

In addition to the development of the electron linac, which was commissioned earlier this year and involved a major effort by DESY in collaboration with numerous other accelerator facilities over the past decade (see “The world’s longest superconducting linac”), the European XFEL has driven the development of both undulator technology and advanced X-ray optics. This multinational and multidisciplinary effort now opens perspectives for novel scientific experiments. When fully commissioned, towards the end of 2018, the facility will deliver 4000 hours of accelerator time per year for user experiments that are approved via external peer review.

Manipulating X-rays

Synchrotron radiation was first observed experimentally at General Electric in 1947, and the first generation of synchrotron-radiation users were termed “parasitic” because they made use of X-rays produced as a byproduct of particle-physics experiments. Dedicated “second-generation” X-ray sources were established in the early 1970s, while much more brilliant “third-generation” sources based on devices called undulators started to appear in the early 1990s (figure 2). The SASE technology underpinning XFELs, which followed from work undertaken in the mid-1960s, ensures that the produced X-rays are much more intense and more coherent than those emitted by storage rings (see SASE panel below). Like the light coming from an optical laser, the X-rays generated by SASE are almost 100% transversely coherent compared to less than one per cent for third-generation synchrotrons, indicating that the radiation is an almost perfect plane wave. Even though the longitudinal-coherence length is not comparable to that of a single-mode optical laser, the use of the term “X-ray laser” is clearly justified for facilities such as the European XFEL.

A major challenge with X-ray lasers is to develop the mirrors, monochromators and other optical components that enable high-energy X-rays to be manipulated and their coherence to be preserved. Compared with the visible light emerging from a standard red helium-neon laser, which has a wavelength of 632 nm, the typical wavelength of hard X-rays is around 0.1 nm. Consequently, X-ray laser light is up to 6000 times more sensitive to distortions in the optics. On the other hand, X-ray mirrors work at extremely small grazing incidence angles (typically around 0.1° for hard X-rays at the European XFEL) because the interaction between X-rays and matter is so weak. This reduces the sensitivity to profile distortions and makes errors of up to 2 nm tolerable on a 1 m-long X-ray mirror, before the reflected X-ray wavefront becomes noticeably affected. Still, these requirements on profile errors are extremely high – about 10 times more stringent than for the Hubble Space Telescope mirror, for example.
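
As a quick sanity check on the figures above, the sensitivity ratio and the grazing-incidence footprint follow from simple geometry; the 1 mm beam height below is an assumed, illustrative value.

    import math

    lam_hene = 632e-9   # He-Ne laser wavelength (m), from the text
    lam_xray = 0.1e-9   # hard-X-ray wavelength (m), from the text
    print(f"sensitivity ratio: {lam_hene/lam_xray:.0f}x")  # ~6320x

    # A 0.1 degree grazing angle stretches the beam footprint along the
    # mirror, which is what spreads the heat load over a large area.
    theta = math.radians(0.1)
    beam_height = 1e-3  # assumed beam height (m)
    print(f"footprint on mirror: {beam_height/math.sin(theta):.2f} m")  # ~0.57 m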

The technology to produce these ultra-flat X-ray mirrors was only developed in recent years in Japan and Europe. It is based on a process called deterministic polishing, in which material is removed atomic layer by atomic layer according to a very precisely measured map of the initial profile’s deviations from an ideal shape. After years of development and many months of deterministic polishing iterations, the first 95 cm-long silicon X-ray mirror fulfilling the tight specifications of the European XFEL was completed in March 2016, with 10 more mirrors of similar quality following shortly thereafter. In the final configuration, 27 of these extremely precise mirrors will be used to steer the X-ray laser beam along the photon-transport tunnels to all the scientific instruments.

Managing the large heat loads on the European XFEL mirrors is a major challenge. To remove the heat generated by the X-ray laser beam without distorting the highly sensitive mirrors, a liquid-metal film is used to couple the mirror to a water-cooling system in a tension- and vibration-free fashion. Another mirror system will be cooled to a temperature of around 100 K, at which the thermal-expansion coefficient of silicon is close to zero. This solution, which is vital to deal with the high repetition rate of the European XFEL, is often employed for smaller silicon crystals acting as crystal monochromators but is rarely necessary for large mirror bodies where the grazing-incidence geometry spreads the heat over a large area.

Indeed, the SASE pulses have potentially devastating power – especially close to the sample where the beam may be focused to small dimensions. A typical SASE X-ray pulse of 100 fs duration contains about 2 mJ of thermal X-ray energy (corresponding to 10^12 photons at 12 keV photon energy), which means that a copper beam-stop placed close behind the sample would be heated to a temperature of several hundred thousand °C and could therefore be evaporated (along with the sample) from just one pulse. While this is not necessarily a problem for samples that can be replaced via advanced injection schemes and where data can be collected before destruction takes place, it could shorten the lifetime of slits, attenuators, windows and other standard beamline components. The solution is to intersect the beam only where it has a larger size and to use only light elements that absorb less X-ray energy per atom. Still, stopping the X-ray laser beam remains a challenge at the European XFEL, with up to 2700 pulses in a 600 μs pulse train (figure 3). Indeed, the entire layout of the photon-distribution system was adapted to counteract this damaging effect of the X-ray laser beam, and a facility-wide machine-protection system limits the pulse-train length to a safe limit, depending on the optical configuration. Since a misguided X-ray laser beam can quickly drill through the stainless-steel pipes of the vacuum system, diamond plates are positioned around the beam trajectory and will light up if hit by X-rays, triggering a dump of the electron beam.
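
The quoted numbers can be reproduced in a few lines; the focused-spot size and absorption depth used in the copper estimate are rough assumptions chosen only to reproduce the order of magnitude, not European XFEL parameters.

    import math

    eV = 1.602e-19                  # joules per electronvolt
    E_pulse = 1e12 * 12e3 * eV      # 10^12 photons at 12 keV
    print(f"pulse energy: {E_pulse*1e3:.1f} mJ")  # ~1.9 mJ, as quoted

    # Adiabatic temperature rise if one pulse is absorbed in a tiny
    # copper volume near focus (20 um spot and depth are assumptions).
    rho, c_p = 8960.0, 385.0        # copper density (kg/m^3), specific heat (J/kg/K)
    spot, depth = 20e-6, 20e-6
    mass = rho * math.pi * (spot/2)**2 * depth
    print(f"temperature rise: {E_pulse/(mass*c_p):.0f} K")  # ~9e4 K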

The business end of things

At the European XFEL, the generation of X-ray beams is largely “behind the scenes”. The scientific interest in XFEL experiments stems from the ability to deliver around 10^12 X-ray photons in one ultrafast pulse (with a duration in the range 10–100 fs) and with a high degree of coherence. Performing experiments within such short pulses allows users to generate ultrafast snapshots of dynamics that would be smeared out with longer exposure times and give rise to diffuse scattering. Combined with spectroscopic information, a complete picture of atomic motion and molecular rearrangements, as well as the charge and spin states and their dynamics, can be built up. This leads to the notion of a “molecular movie”, in which the dynamics are triggered by an external optical laser excitation (acting as an optical pump) and the response of a molecule is monitored by ultrafast X-ray scattering and spectroscopy (X-ray probe). Pump-probe experiments are typically ensemble-averaged measurements of many molecules that are randomly aligned with respect to each other and not distinguishable within the scattering volume. The power and coherence of the European XFEL beams will allow such investigations with unprecedented resolution in time and space compared to today’s best synchrotrons.

In particular, the coherence of the European XFEL beam allows users to distinguish features beyond those arising from average properties. These features are encoded in the scattering images as grainy regions of varying intensity called speckle, which results from the self-interference of the scattered beam and can be exploited to obtain higher spatial resolution than is possible in “incoherent” X-ray scattering experiments (figure 4). Since the speckles reflect the exact real-space arrangement of the scattering volume, even subtle structural changes can alter the speckle pattern dramatically due to interference effects.

The combination of ultrafast pulses, huge peak intensity and a high degree of beam coherence is truly unique to FEL facilities and has already enabled experiments that otherwise were impossible. In addition, the European XFEL has a huge average intensity due to the many pulses delivered each second. This allows a larger number of experimental sessions per operation cycle and/or better signal-to-noise ratios within a given experimental time frame. The destructive power of the beam means that many experiments will be of the single-shot type, which requires a continuous injection scheme because the sample cannot be reused. Other experiments will operate with reduced peak flux, allowing multi-exposure schemes as also demonstrated in work at LCLS and FLASH.

Six experimental stations are planned for the European XFEL start-up, two per SASE beamline. The first, situated at the hard-X-ray undulator SASE-1, is devoted to the study of single particles and biomolecules, serial femtosecond crystallography, and femtosecond X-ray experiments in biology and chemistry. SASE-2 caters to dynamics investigations in condensed-matter physics and material-science experiments, specialising in extreme states of matter and plasmas. At the soft-X-ray branch SASE-3, two instruments will allow investigations of electronic states of matter and atomic/cluster physics, among other studies. The three SASE undulators will deliver photons in parallel and the instruments will share their respective beams in 12-hour shifts, so that three instruments are always operating at any given time.

Eight years after the project officially began, the European XFEL finally achieved first light in 2017 and its commissioning is progressing according to schedule. The facility is the culmination of a worldwide effort led by DESY concerning the electron linac and by European XFEL GmbH for the development of X-ray photon transport and experimental stations. The facility is conveniently situated among other European light sources – synchrotrons that are also continuously evolving towards larger brilliance – and a handful of hard-X-ray FELs worldwide. The European XFEL is by far the most powerful hard-X-ray source in the world and will remain at the forefront for at least the next 20–30 years. Continuous investment in instrumentation and detectors will be required to capitalise fully on the impressive specifications, and the facility has the potential to construct about six additional instruments and possibly even a second experimental hall, all fed by X-rays generated by the existing superconducting electron linac. Without a doubt, Europe has now entered the extreme X-ray era.

The principle of SASE

Self-Amplified Spontaneous Emission (SASE), the underlying principle of X-ray free-electron lasers, is based on the interaction between a relativistic electron beam and the radiation emitted by the electrons as they are accelerated through a long alternating magnetic undulator array (see image). If the undulator is short, on the order of a few metres, and the undulating path is well defined with a small amplitude, the radiation emitted by one electron adds up coherently at one particular wavelength as it travels through the undulator. Hence, the intensity is proportional to Np^2, where Np is the number of undulator periods (typically around 100). This is the regular undulator radiation generated at third-generation synchrotron sources such as the ESRF in France or APS in the US, and also at the next generation of diffraction-limited storage rings, such as MAX IV in Sweden. On the other hand, if the undulator is very long, the interactions between the electrons and the radiation field that builds up will eventually lead to micro-bunching of the electron beam into coherent packages that radiate in phase (see image). This results in a huge amplification (lasing) of emitted intensity as it becomes proportional to Ne^2, where Ne is the number of electrons emitting in phase within the co-operation length (typically 10^6, or more). The hard-X-ray undulators of the European XFEL have magnetic lengths of 175 m in order to ensure that SASE works over a wide range of photon energies and electron-beam parameters. High electron energy, small energy spread and a small emittance (the product of beam size and divergence) are crucial for SASE to work in the X-ray range. Together with the requirement of very long undulators, it favours the use of linac sources, instead of storage rings, for X-ray lasers.
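
A minimal numerical sketch of the panel, combining the standard on-axis undulator resonance formula with the two intensity scalings; the undulator period and K value are assumed representative numbers, not European XFEL specifications.

    # On-axis undulator resonance: lam = lam_u/(2*gamma^2) * (1 + K^2/2)
    E_beam_eV = 17.5e9               # electron energy, from the text
    gamma = E_beam_eV / 0.511e6      # Lorentz factor (~34,000)
    lam_u, K = 0.04, 3.0             # 40 mm period and strength K (assumed)
    lam = lam_u / (2 * gamma**2) * (1 + K**2 / 2)
    print(f"photon wavelength: {lam*1e9:.2f} nm")  # ~0.09 nm: hard X-rays

    # The two scalings quoted above: Np^2 for ordinary undulator
    # radiation, Ne^2 once micro-bunched electrons radiate in phase.
    Np, Ne = 100, 1e6
    print(f"undulator ~ {Np**2:.0e}, SASE ~ {Ne**2:.0e}")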

The post Europe enters the extreme X-ray era appeared first on CERN Courier.

Feature The European X-ray Free-Electron Laser will probe electronic, chemical and biological processes in unprecedented detail. https://cerncourier.com/wp-content/uploads/2018/06/CCxfe1_06_17.jpg
Discovering diamonds https://cerncourier.com/a/discovering-diamonds/ Mon, 10 Jul 2017 07:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/discovering-diamonds/ Natural diamonds are old, almost as old as the planet itself. They mostly originated in the Earth’s mantle around 1 to 3.5 billion years ago and typically were brought to the surface during deep and violent volcanic eruptions some tens of millions of years ago. Diamonds have been sought after for millennia and still hold […]

Natural diamonds are old, almost as old as the planet itself. They mostly originated in the Earth’s mantle around 1 to 3.5 billion years ago and typically were brought to the surface during deep and violent volcanic eruptions some tens of millions of years ago. Diamonds have been sought after for millennia and still hold status. They are also one of our best windows into our planet’s dynamics and can, in what is essentially a galactic narrative, convey a rich story of planetary science. Each diamond is unique in its chemical and crystallographic detail, with micro-inclusions and impurities within them having been protected over vast timescales.

Diamonds are usually found in or near the volcanic pipe that brought them to the surface. It was at one of these, in 1871 near Kimberley, South Africa, where the diamond rush first began – and where the mineral that hosts most diamonds got its name: kimberlite. Many diamond sources have since been discovered and there are now more than 6000 known kimberlite pipes (figure 1 overleaf). However, with current mining extraction technology, which generally involves breaking up raw kimberlite to see what’s inside, diamonds are often damaged and deposits are steadily becoming mined out. Today, a diamond mine typically lasts for a few decades, and it costs around $10–26 to process each tonne of rock. With the number of new, economically viable diamond sources declining – combined with high rates of diamonds being extracted, ageing mines and increasing costs – most forecasts predict a decline in rough diamond production compared to demand, starting as soon as 2020.

A new diamond-discovery technology called MinPET (mineral positron emission tomography) could help to ensure that precious sources of natural diamonds last for much longer. Inspired by the same principles adopted in modern, high-rate, high-granularity detectors commonly found in high-energy physics experiments, MinPET uses a high-energy photon beam and PET imaging to scan mined kimberlite for large diamonds, before the rocks are smashed to pieces.

From eagle eyes to camera vision

Over millennia, humans have invented numerous ways to look for diamonds. Early techniques to recover loose diamonds used the principle that diamonds are hydrophobic, so resist water but stick readily to grease or fat. Some stories even tell of eagles recovering diamonds from deep, inaccessible valleys, when fatty meat thrown onto a valley floor might stick to a gem: a bird would fly down, devour the meat, and return to its nest, where the diamond could be recovered from its droppings. Today, technology hasn’t evolved much. Grease tables are still used to sort diamond from rock, and the current most popular technique for recovering diamonds (a process called dense media separation) relies on the principle that kimberlite particles float in a special slurry while diamonds sink. The excessive processing required with these older technologies wastes water, takes up huge amounts of land, releases dust into the surrounding atmosphere, and also leads to severe diamond breakage.    

Just 1% of the world’s diamond sources have economically viable grades of diamond and are worth mining. At most sites the gemstones are hidden within the kimberlite, so diamond-recovery techniques must first crush each rock into gravel. The more barren rock there is compared to diamonds, the more sorting has to be done. This varies from mine to mine, but typically is under one carat per tonne – more dilute than gold ores. Global production was around 127 million carats in 2015, meaning that mines are wasting millions of dollars crushing and processing about 100 million tonnes of kimberlite per year that contains no diamonds. We therefore have an extreme case of a very high-value particle within a large amount of worthless material – making it an excellent candidate for sensor-based sorting.
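
Putting the figures in this and the preceding paragraph together gives a feel for the scale of the waste; this is a minimal estimate using only the numbers quoted in the text.

    carats_2015 = 127e6            # global production (carats), from the text
    grade = 1.0                    # carats per tonne (typical upper bound)
    tonnes = carats_2015 / grade   # kimberlite processed per year
    cost_lo, cost_hi = 10, 26      # processing cost ($/tonne), from the text
    print(f"rock processed: {tonnes/1e6:.0f} Mt per year")
    print(f"processing bill: ${tonnes*cost_lo/1e9:.1f}-{tonnes*cost_hi/1e9:.1f} bn per year")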

Early forms of sensor-based sorting, which have only been in use since 2010, use a technique called X-ray stimulated optical fluorescence, which essentially targets the micro impurities and imperfections in each diamond (figure 2). Using this method, the mined rocks are dropped during the extraction process at the plant, and the curtain of falling rock is illuminated by X-rays, allowing a proportion of liberated or exposed diamonds to fluoresce and then be automatically extracted. The transparency of diamond makes this approach quite effective. When Petra Diamonds Ltd introduced this technique with several X-ray sorting machines costing around $6 million, the apparatus paid for itself in just a few months when the firm recovered four large diamonds worth around $43 million. These diamonds, presumed to be fragments of a larger single one, were 508, 168, 58 and 53 carats, in comparison to the average one-carat engagement ring.

Very pure diamonds that do not fluoresce, and gems completely surrounded by rock, can remain hidden to these sensors. As such, a newer sensor-based sorting technique that uses an enhanced form of dual-energy X-ray transmission (XRT), similar to the technology for screening baggage in airports, has been invented to get around this problem. It can recover liberated diamonds down to 5 mm diameter, where 1 mm is usually the smallest size recovered commercially, and, unlike the fluorescing technique, can detect some locked diamonds. These two techniques have brought the benefits of sensor-based sorting into sharp focus for more efficient, greener mines and for reducing breakage.

Recent innovations in particle-accelerator and particle-detector technology, in conjunction with high-throughput electronics, image-processing algorithms and high-performance computing, have greatly enhanced the economic viability of a new diamond-sensing technology using PET imaging. PET, which has strongly benefitted from many innovations in detector development at CERN, such as BGO scintillating crystals for the LEP experiments, has traditionally been used to observe processes inside the body. A patient must first absorb a small amount of a positron-emitting isotope; the ensuing annihilations produce patterns of gamma rays that can be reconstructed to build a 3D picture of metabolic activity. Since a rock cannot be injected with such a tracer, MinPET requires us to irradiate rocks with a high-energy photon beam and generate the positron emitter via transmutation.

The birth of MinPET

The idea to apply PET imaging to mining began in 1988, in Johannesburg, South Africa, where our small research group of physicists used PET emitters and positron spectroscopy to study the crystal lattice of diamonds. We learnt of the need for intelligent sensor-based sorting from colleagues in the diamond mining industry and naturally began discussing how to create an integrated positron-emitting source.

Advances in PET imaging over the next two decades led to increased interest from industry, and in 2007 MinPET achieved its first major success in an experiment at Karolinska hospital in Stockholm, Sweden. With a kimberlite rock playing the role of a patient, irradiation was performed at the hospital’s photon-based cancer therapy facility and the kimberlite was then imaged at the small-animal PET facility in the same hospital. The images clearly revealed the diamond within, with PET imaging of diamond in kimberlite reaching an activity contrast of more than 50 (figure 3). This result led to a working technology demonstrator involving a conveyor belt that presented phantoms (rocks doped with a sodium PET-emitter were used to represent the kimberlite, some of which contained a sodium hotspot to represent a hidden diamond) to a PET camera. These promising results attracted funding, staff and students, enabling the team to develop a MinPET research laboratory at iThemba LABS in Johannesburg. The work also provided an important early contribution to South Africa’s involvement in the ATLAS experiment at CERN’s Large Hadron Collider.

By 2015 the technology was ready to move out of the lab and into a diamond mine. The MinPET process (figure 4) involves using a high-energy photon beam of some tens of MeV to irradiate a kimberlite rock stream, turning some of the light stable isotopes within the kimberlite into transient positron emitters, or PET isotopes, which can be imaged in a similar way to PET imaging for medical diagnostics. The rock stream is buffered for a period of 20 minutes before imaging, because by then carbon is the dominant PET isotope. Since non-diamond sources of carbon have a much lower carbon concentration than diamond, or are diluted and finely dispersed within the kimberlite, diamonds show up on the image as a carbon-concentration hotspot.
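
The 20-minute buffer can be understood from the decay arithmetic below; the half-lives are standard nuclear data, while the equal initial activities are assumed purely for illustration (real relative yields depend on the rock’s composition and the photonuclear cross-sections).

    # After ~20 minutes the short-lived photonuclear products have
    # largely decayed, leaving carbon-11 as the dominant PET emitter.
    half_life_min = {"O-15": 2.04, "N-13": 9.97, "C-11": 20.4}
    t = 20.0  # buffering time (minutes)
    for iso, t12 in half_life_min.items():
        print(f"{iso}: {0.5 ** (t / t12):.3f} of initial activity left")
    # O-15 is down by a factor ~1000, while C-11 retains about half.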

The speed of imaging is crucial to the viability of MinPET. The detector system must process up to 1000 tonnes of rock per hour to meet the rate of commercial rock processing, with PET images acquired in just two seconds and image processing taking just five seconds. This is far in excess of medical-imaging needs and required the development of a very high-rate PET camera, which was optimised, designed and manufactured in a joint collaboration between the present authors and a nuclear electronic technology start-up called NeT Instruments. MinPET must also take into account rate capacity, granularity, power consumption, thermal footprints and improvements in photon detectors. The technology demonstrator is therefore still used to continually improve MinPET’s performance, from the camera to raw data event building and fast-imaging algorithms.
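
To see why such a high-rate camera is needed, consider the particle rate implied by 1000 tonnes per hour; the per-particle mass below is an assumption for a roughly 100 mm kimberlite fragment.

    throughput = 1000e3 / 3600   # 1000 t/h in kg/s (~278 kg/s)
    particle_mass = 1.4          # kg, assumed ~100 mm kimberlite fragment
    rate = throughput / particle_mass
    print(f"particles per second: {rate:.0f}")  # ~200
    # With 2 s acquisitions, hundreds of particles pass through the
    # camera's field of view per exposure, so the detector must sustain
    # very high coincidence rates with fine granularity.
    print(f"particles per 2 s exposure: {rate*2:.0f}")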

An important consideration when dealing with PET technology is ensuring that radiation remains within safe limits. If diamonds are exposed to extremely high doses of radiation, their colour can change – something that can be done deliberately to alter the gems, but which reduces customer confidence in a gem’s history. Despite being irradiated, the diamonds receive a dose during the MinPET activation process that is well below the level they would receive from nature’s own background. It has turned out, quite amazingly, that MinPET offers a uniquely radiologically clean scenario. The carbon PET activity and a small amount of sodium activity are the only significant activations, and these have relatively short half-lives of 20 minutes and 15 hours, respectively. The irradiated kimberlite stream soon becomes indistinguishable from non-irradiated kimberlite, and therefore has a low activity and allows normal mine operation.

Currently, XRT imaging techniques require each particle of kimberlite rock being processed to be isolated and smaller than 75 mm; within this stream only liberated diamonds that are at least 5 mm wide can be detected and XRT can only provide 2D images. MinPET is far more efficient because it is currently able to image locked diamonds with a width of 4 mm within a 100 mm particle of rock, with full 3D imaging. The size of diamonds MinPET detects means it is currently ideally suited for mines that make their revenue predominantly from large diamonds (in some mines breakage is thought to cause up to a 50% drop in revenue). There is no upper limit for finding a liberated diamond particle using MinPET, and it is expected that larger diamonds could be detected in up to 160 mm-diameter kimberlite particles.

To crumble or shine

MinPET has now evolved from a small-scale university experiment to a novel commercial technology, and negotiations with a major financial partner are currently at an advanced stage. Discussions are also under way with several accelerator manufacturers to produce a 40 MeV beam of electrons with a power of 40–200 kW, which is needed to produce the original photon beam that kick-starts the MinPET detection system.

Although the MinPET detection system costs slightly more than other sorting techniques, overall expenditure is less because processing costs are reduced. Envisaged MinPET improvements over the next year are expected to take the lower limit of discovery down to as little as 1.5 mm for locked diamonds. The ability to reveal entire diamonds in 3D, and locating them before the rocks are crushed, means that MinPET also eliminates much of the breakage and damage that occurs to large diamonds. The technique also requires less plant, energy and water – all without causing any impact on normal mine activity.

The world’s diamond mines are increasingly required to be greener and more efficient. But the industry is also under pressure to become safer, and the ethics of mining operations are a growing concern among consumers. In a world increasingly favouring transparency and disclosure, the future of diamond mining has to be in using intelligent, sensor-based sorting that can separate diamonds from rock. MinPET is the obvious solution – eventually allowing marginal mines to become profitable and the lifetime of existing mines to be extended. And although today’s synthetic diamonds offer serious competition, natural stones are unique, billions of years old, and came to the surface in a violent fiery eruption as part of a galactic narrative. They will always hold their romantic appeal, and so will always be sought after.

The post Discovering diamonds appeared first on CERN Courier.

Feature https://cerncourier.com/wp-content/uploads/2018/06/CCdia1_06_17.jpg
SESAME sees first beam https://cerncourier.com/a/sesame-sees-first-beam/ Wed, 15 Feb 2017 09:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/sesame-sees-first-beam/ Late in the evening of 12 January, a beam of electrons circulated for the first time in the SESAME light source in Jordan. Following the first single turn, the next steps will be to achieve multi-turns, store and then accelerate a beam. This is an important milestone towards producing intense beams of synchrotron light at the […]

Late in the evening of 12 January, a beam of electrons circulated for the first time in the SESAME light source in Jordan. Following the first single turn, the next steps will be to achieve multi-turns, store and then accelerate a beam. This is an important milestone towards producing intense beams of synchrotron light at the pioneering facility, which is the first light-source laboratory in the Middle East.

SESAME, which stands for Synchrotron-light for Experimental Science and Applications in the Middle East, will eventually operate several beamlines at different wavelengths for wide-ranging studies of the properties of matter. Experiments there will enable SESAME users to undertake research in fields ranging from medicine and biology, through materials science, physics and chemistry to healthcare, the environment, agriculture and archaeology.

CERN has a long-standing involvement with SESAME, notably through the European Commission-funded CESSAMag project, coordinated by CERN. This project provided the magnet system for SESAME’s 42 m-diameter main ring and brought CERN’s expertise in accelerator technology to the facility in addition to training, knowledge and technology transfer.

The January milestone follows a series of key events, beginning with the establishment of a Middle East Scientific Collaboration group in the mid-1990s. This was followed by the donation of the BESSY1 accelerator by the BESSY laboratory in Berlin. Refurbished and upgraded components of BESSY1 now serve as the injector for the completely new SESAME main ring, which is a competitive third-generation light source built by SESAME with support from its members, the European Commission and CERN (through CESSAMag), and Italy.

There is still a lot of work to be done before experiments can get underway. Beams have to be accelerated to SESAME’s operating energy of 2.5 GeV. Then the synchrotron light emitted as the beams circulate has to be channelled along SESAME’s two initial beamlines and optimised for the experiments that will take place there. This process is likely to take around six months, leading to first experiments in the summer of 2017.

The post SESAME sees first beam appeared first on CERN Courier.

News https://cerncourier.com/wp-content/uploads/2018/06/CCnew3_02_17.jpg
GSI ions target irregular heartbeat https://cerncourier.com/a/gsi-ions-target-irregular-heartbeat/ Wed, 15 Feb 2017 09:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/gsi-ions-target-irregular-heartbeat/ Researchers at the GSI Helmholtz Center for Heavy Ion Research in Darmstadt, Germany, have demonstrated the feasibility of using carbon ions to treat cardiac arrhythmia, in which abnormal electrical patterns can lead to sudden heart failure or permanent damage as a result of stroke. Conventional treatments for certain forms of cardiac arrhythmia include drugs or […]

Researchers at the GSI Helmholtz Center for Heavy Ion Research in Darmstadt, Germany, have demonstrated the feasibility of using carbon ions to treat cardiac arrhythmia, in which abnormal electrical patterns can lead to sudden heart failure or permanent damage as a result of stroke. Conventional treatments for certain forms of cardiac arrhythmia include drugs or “catheter ablation,” in which catheters are guided through blood vessels to the heart to destroy certain tissue. The GSI team, in conjunction with physicians from Heidelberg University and the Mayo Clinic in the US, has now shown that high-energy carbon ions produced by a particle accelerator can in principle be used to perform such treatments without catheters.

The non-invasive procedure induces specific changes to cardiac tissue that prevent the transmission of electrical signals, permanently interrupting the propagation of disruptive impulses. Following promising results from initial tests on cardiac cell cultures and beating-heart preparations, the researchers developed an animal study. Further detailed studies are needed, however, before the method can start to benefit patients.

A crucial advantage of the new method is that the ions can penetrate to any desired depth. Irradiating cardiac tissue with carbon ions appears to be a promising, non-invasive alternative to catheters, and ultimately ion-based procedures are expected to take a few minutes rather than a few hours. “It is exciting that the carbon beam could work with surgical precision in particularly sensitive areas of the body,” says Paolo Giubellino, scientific managing director of FAIR and GSI and former spokesperson of the LHC’s ALICE experiment at CERN. “We’re proud that the first steps toward a new therapy have now been taken.”

The post GSI ions target irregular heartbeat appeared first on CERN Courier.

News https://cerncourier.com/wp-content/uploads/2018/06/CCnew16_02_17.jpg
Developing medical linacs for challenging regions https://cerncourier.com/a/developing-medical-linacs-for-challenging-regions/ Wed, 15 Feb 2017 09:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/developing-medical-linacs-for-challenging-regions/ Physicists, oncologists and industry experts are defining the design of a novel linear accelerator that will make radiotherapy more readily available in lower-resourced countries.

The annual global incidence of cancer is expected to rise from 15 million cases in 2015 to as many as 25 million cases in 2035. Of these, it is estimated that 65–70% will occur in low-and middle-income countries (LMICs) where there is a severe shortfall in radiation treatment capacity. The growing burden of cancer and other non-communicable diseases in these countries has been recognised by the United Nations General Assembly and the World Health Organization.

Radiation therapy is an essential component of effective cancer control, and approximately half of all cancer patients – regardless of geographic location – would benefit from such treatment. The vast majority of modern radiotherapy facilities rely on linear accelerators (linacs) to accelerate electrons, which are either used directly to treat superficial tumours or are directed at targets such as tungsten to produce X-rays for treating deep-seated tumours.

Electron linacs were first used clinically in the 1950s, in the UK and the US. Since then, great advances in photon treatment have been made. These are due to improved imaging, real-time beam shaping and intensity modulation of the beam with multileaf collimators, and knowledge of the radiation doses needed to kill tumours, alone or in combination with drugs. In addition, the use of particle beams means that radiotherapy directly benefits from knowledge and technology gained in high-energy-physics research.

Meeting global demand

In September 2015, the Global Task Force on Radiotherapy for Cancer Control (GTFRCC) released a comprehensive study of the global demand for radiation therapy. It highlighted the inadequacy of current equipment coverage (image at top) and the resources required, as well as the costs and economic and societal benefits of improving coverage.

Limiting factors to the development and implementation of radiotherapy in lower-resourced nations include the cost of equipment and infrastructure, and the shortage of trained personnel to properly calibrate and maintain the equipment and to deliver high-quality treatment. The GTFRCC report estimated that as many as 12,600 megavolt-class treatment machines will be needed to meet radiotherapy demands in LMICs by 2035. Based on current staffing models, it was estimated that an additional 30,000 radiation oncologists, more than 22,000 medical physicists and almost 80,000 radiation technologists will be required.

Approximately three years ago, with the aim of making cancer treatments accessible to underserved populations, initial discussions took place between CERN and representatives of the US National Cancer Institute and an emerging non-government organisation, the International Cancer Expert Corps (ICEC), whose aim is to help LMICs establish in-country cancer-care expertise. The focus of discussions was on an “out-of-the-box” concept for global health, specifically the design of a novel, possibly modular, linear accelerator for use in challenging environments (defined as those in which the general infrastructure is poor or lacking, where power outages and water-supply fluctuations can occur, and where climatic conditions might be harsh). Following further activities, CERN hosted a workshop in November 2016 convened by the ICEC, which brought together invited experts from many disciplines including industry (see panel below).

In addition to improving the quality of care for cancer patients globally, linac-based radiotherapy systems also reduce the reliance on less expensive and simpler systems that provide treatment with photons from radionuclide sources such as 60Co and 137Cs. While some of the 60Co units have multileaf collimators for improved beam delivery, they do not have the advanced features of modern linacs. Eliminating radionuclides also reduces the risk of malicious use of medical radioactive materials (see panel below).

Design characteristics

It is important that the newly designed linac retains the advanced capability of the machines now in use, and that through software advances, resource sharing and sustainable partnerships, the treatments in LMICs are of comparable quality to those in upper-income countries. This not only avoids substandard care but is also an incentive for experts to go to and remain in LMICs.

CERN workshop initiates discussions for novel medical linacs

On 7–8 November 2016, CERN hosted a first-of-its-kind workshop to discuss the design characteristics of radiotherapy linacs for low- and middle-income countries (LMICs). Around 75 participants from 15 countries addressed: the role of radiotherapy in treating cancer in challenging environments and the related security of medical radiological materials, especially 60Co and 137Cs; the design requirements of linear accelerators and related technologies for use in challenging environments; the education and training of a sustainable workforce needed to utilise novel radiation treatment systems; and the cost and financing of the project. Leading experts were invited from international organisations, government agencies, research institutes, universities and hospitals, and companies that produce equipment for conventional X-ray and particle therapy.

The ideal radiation-therapy treatment system for LMICs is thought to be as modular as possible, so that it can be easily shipped, assembled in situ, repaired and upgraded as local expertise in patient treatment develops. Another critical issue concerns the sustainability of treatment systems after installation. To minimise the need for local specialised technical staff to maintain and promptly repair facilities, procedures and economic models need to be developed to ensure regional technical expertise and also a regional supply of standard spare parts and simpler (modular) replacement procedures. Difficulties due to remoteness and poor communication also need to be considered.

There are several design considerations when developing a linear accelerator for operation in challenging environments. In addition to ease of operation, repair and upgradability, key factors include reliability, self-diagnostics, insensitivity to power interruptions, low power requirements and reduced heat production. To achieve most of these design considerations relatively quickly requires a system based on current hardware technology and software that fully exploits automation. The latter should include auto-planning and operator monitoring and training, even to the point of having a treatment system that depends on limited on-site human involvement, to allow high-quality treatment to be delivered by an on-site team with less technical expertise.

Current machines can be improved through software upgrades, but substantial technological advances generally require the purchase of an entire new unit – often costing many millions of dollars. A modular design that allows major upgrades of components on the same base unit could be much less expensive. Major savings would also result from developing new advanced software to expand the capability of the hardware.

Participants in the CERN workshop agreed on the need to develop a treatment machine that delivers state-of-the-art radiation therapy, rather than a linac that is sub-standard in terms of the quality of treatment it could deliver. The latter approach would not only provide lower-quality treatment but would be a disincentive for recruitment and retention of high-quality staff. As used in virtually all industries, the user interface should be developed through interaction with the users. Improved hardware, such as a power generator combined with energy management, should also be provided to cope with electrical-network fluctuations.

The task ahead

Experience from past and current radiation-therapy initiatives suggests that successful radiotherapy programmes require secure local resources, adequate planning, local commitment and political stability. To make a highly functional radiotherapy treatment system available in the near-term, one could upgrade one or more existing linear accelerators with software optimisations. The design and development of a truly novel radiation treatment system, on the other hand, will require a task force to refine the design criteria and then begin development and production.

Treatment, not terror

With the rise in global terrorism comes the threat of the use of un- or poorly secured radioactive sources that would have enormous health, economic and political consequences. This includes medical sources such as 60Co that are generally not highly protected, many of which are located in relatively under-resourced regions. Interest in developing alternative technologies has brought together medical practitioners who currently use these sources, governmental and global agencies whose mission includes the security of radiological and nuclear material, and organisations dedicated to the non-proliferation of nuclear weapons.

This confluence of expertise resulted in meetings in Brazil and South Africa in 2016, with the realisation that simply removing 60Co would leave people in many regions without cancer care. Removing dangerous sources while establishing a better cancer-care environment would require education, training, mentorship and partnerships to use more complex linear-accelerator-based radiotherapy systems. The austerity of the environment is a challenge that requires new thinking, however.

The ability to offer a state-of-the-art non-isotopic radiation treatment system for challenging environments was emphasised by the Office of Radiological Security of the US National Nuclear Security Administration, which is responsible for reducing the global reliance on radioactive sources as well as protecting those sources from unauthorised access. The benefit of replacing 60Co radiation treatment units with linear accelerators from the point of view of decreasing the risk of malicious use of 60Co by non-state (terrorist) actors was also emphasised in a report from the Center for Nonproliferation Studies that offered the new paradigm “treatment, not terror”.

Following the November workshop, an oversight committee and three task forces have been established. A technology task force will focus on systems solutions and novel technology for a series of radiation-treatment systems that incorporate intelligent software and are modular, rugged and easily operated yet sufficiently sophisticated to also benefit therapy in high-income countries. A second task force will identify education and training requirements for the novel treatment systems, in addition to evaluating the impact of evolving treatment techniques, changes in cancer incidence and the population mix. Finally, a global connectivity and fundraising task force will develop strategies for securing financial support in client countries as well as from governmental, academic and philanthropic organisations and individuals.

The overall aim of this ambitious project is to make excellent near-term and long-term radiation treatment systems, including staffing and physical infrastructure, available for the treatment of cancer patients in LMICs and other geographically underserved regions in the next 5–10 years. The high-energy physics community’s broad expertise in global networking, technology innovation and open-source knowledge for the benefit of health is essential to the progress of this effort. It is anticipated that an update meeting will take place at the International Conference on Advances in Radiation Oncology (ICARO2) to be held in Vienna in June 2017.

The post Developing medical linacs for challenging regions appeared first on CERN Courier.

Feature Physicists, oncologists and industry experts are defining the design of a novel linear accelerator that will make radiotherapy more readily available in lower-resourced countries. https://cerncourier.com/wp-content/uploads/2018/06/CClin1_02_17.jpg
CLOUD experiment sharpens climate predictions https://cerncourier.com/a/cloud-experiment-sharpens-climate-predictions/ Fri, 11 Nov 2016 09:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/cloud-experiment-sharpens-climate-predictions/ Future global climate projections have been put on more solid empirical ground, thanks to new measurements of the production rates of atmospheric aerosol particles by CERN’s Cosmics Leaving OUtdoor Droplets (CLOUD) experiment. According to the Intergovernmental Panel on Climate Change, the Earth’s mean temperature is predicted to rise by between 1.5–4.5 °C for a doubling of […]

Future global climate projections have been put on more solid empirical ground, thanks to new measurements of the production rates of atmospheric aerosol particles by CERN’s Cosmics Leaving OUtdoor Droplets (CLOUD) experiment.

According to the Intergovernmental Panel on Climate Change, the Earth’s mean temperature is predicted to rise by between 1.5 and 4.5 °C for a doubling of carbon dioxide in the atmosphere, which is expected by around 2050. One of the main reasons for this large uncertainty, which makes it difficult for society to know how best to act against climate change, is a poor understanding of aerosol particles in the atmosphere and their effects on clouds.
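
For orientation, the quoted range maps onto a climate-sensitivity parameter via the standard simplified forcing expression (textbook material, not a result from CLOUD):

    \Delta F \simeq 5.35 \,\ln(C/C_0)\ \mathrm{W\,m^{-2}}, \qquad \Delta T = \lambda\,\Delta F

Doubling CO2 gives ΔF ≈ 5.35 ln 2 ≈ 3.7 W m⁻², so a rise of 1.5–4.5 °C corresponds to λ ≈ 0.4–1.2 K W⁻¹ m²; much of the uncertainty in λ traces back to the aerosol and cloud effects described below.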

To date, all global climate models use relatively simple parameterisations for aerosol production that are not based on experimental data, in contrast to the highly detailed modelling of atmospheric chemistry and greenhouse gases. Although the models agree with current observations, predictions start to diverge when the models are wound forward to project the future climate.

Now, data collected by CLOUD have been used to build a model of aerosol production based solely on laboratory measurements. The new CLOUD study establishes the main processes responsible for new particle formation throughout the troposphere, which is the source of around half of all cloud seed particles. It could therefore reduce the variation in projected global temperatures as calculated by complex global-circulation models.

“This marks a big step forward in the reliability and realism of how models describe aerosols and clouds,” says CLOUD spokesperson Jasper Kirkby. “It’s addressing the largest source of uncertainty in current climate models and building it on a firm experimental foundation of the fundamental processes.”

Aerosol particles form when certain trace vapours in the atmosphere cluster together, and grow via condensation to a sufficient size that they can seed cloud droplets. Higher concentrations of aerosol particles make clouds more reflective and long-lived, thereby cooling the climate, and it is thought that the increased concentration of aerosols caused by air pollution since the start of the industrial period has offset a large part of the warming caused by greenhouse-gas emissions. Until now, however, the poor understanding of how aerosols form has hampered efforts to estimate the total forcing of climate from human activities.

Thanks to CLOUD’s unique controlled environment, scientists can now understand precisely how new particles form in the atmosphere and grow to seed cloud droplets. In the latest work, published in Science, researchers built a global model of aerosol formation using extensive laboratory-measured nucleation rates involving sulphuric acid, ammonia, ions and organic compounds. Although sulphuric acid has long been known to be important for nucleation, the results show for the first time that observed concentrations of particles throughout the atmosphere can be explained only if additional molecules – organic compounds or ammonia – participate in nucleation. The results also show that ionisation of the atmosphere by cosmic rays accounts for nearly one-third of all particles formed, although small changes in cosmic rays over the solar cycle do not affect aerosols enough to influence today’s polluted climate significantly.
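
The structure of such a laboratory-constrained parameterisation can be sketched as a sum of neutral and ion-induced power-law channels. The snippet below is purely illustrative: the prefactors and exponents are placeholders, not the values fitted to the CLOUD chamber data.

```python
def nucleation_rate(h2so4, nh3, ions,
                    k_neutral=1e-21, k_ion=1e-16, p=2.0, q=1.0):
    """Toy new-particle-formation rate (cm^-3 s^-1).

    Illustrative placeholders only, not CLOUD's published fit.
      h2so4, nh3 -- vapour concentrations (molecules cm^-3)
      ions       -- small-ion concentration from cosmic rays (cm^-3)
    """
    neutral = k_neutral * h2so4**p * nh3**q  # neutral (ternary) channel
    ion_induced = k_ion * h2so4**p * ions    # ion-induced channel
    return neutral + ion_induced

# Boundary-layer-like example values
print(nucleation_rate(h2so4=1e7, nh3=1e9, ions=500.0))
```

A real parameterisation of this kind is tabulated against temperature and vapour concentrations, and is then called by the global model in place of the simple fits mentioned above.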

Early this year, CLOUD reported in Nature the discovery that aerosol particles can form in the atmosphere purely from organic vapours produced naturally by the biosphere (CERN Courier July/August 2016 p11). In a separate modelling paper published recently in PNAS, CLOUD shows that such pure biogenic nucleation was the dominant source of particles in the pristine pre-industrial atmosphere. By raising the baseline aerosol state, this process significantly reduces the estimated aerosol radiative forcing from anthropogenic activities and, in turn, reduces modelled climate sensitivities.

“This is a huge step for atmospheric science,” says lead-author Ken Carslaw of the University of Leeds, UK. “It’s vital that we build climate models on experimental measurements and sound understanding, otherwise we cannot rely on them to predict the future. Eventually, when these processes get implemented in climate models, we will have much more confidence in aerosol effects on climate. Already, results from CLOUD suggest that estimates of high climate sensitivity may have to be revised downwards.”

The post CLOUD experiment sharpens climate predictions appeared first on CERN Courier.

]]>
News
European XFEL enters commissioning phase https://cerncourier.com/a/european-xfel-enters-commissioning-phase/ https://cerncourier.com/a/european-xfel-enters-commissioning-phase/#respond Fri, 11 Nov 2016 09:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/european-xfel-enters-commissioning-phase/ On 6 October, commissioning began at the world’s largest X-ray laser: the European XFEL in Hamburg, Germany. The 3.4 km-long European XFEL will generate ultrashort X-ray flashes with a brilliance one billion times greater than the best conventional X-ray radiation sources based on synchrotrons. The beams will be directed towards samples at a rate of 27,000 […]

The post European XFEL enters commissioning phase appeared first on CERN Courier.

]]>

On 6 October, commissioning began at the world’s largest X-ray laser: the European XFEL in Hamburg, Germany. The 3.4 km-long European XFEL will generate ultrashort X-ray flashes with a brilliance one billion times greater than the best conventional X-ray radiation sources based on synchrotrons. The beams will be directed towards samples at a rate of 27,000 flashes per second, allowing scientists from a broad range of disciplines to study the atomic structure of materials and to investigate ultrafast processes in situ. Commissioning will take place over the next few months, with external scientists able to perform first experiments in summer 2017.

The linear accelerator that drives the European XFEL is based on superconducting “TESLA” technology, which has been developed by DESY and its international partners. Since 2005, DESY has been operating a free-electron laser called FLASH, which is a 260 m-long prototype of the European XFEL that relies on the same technology.

The European XFEL is managed by 11 member countries: Denmark, France, Germany, Hungary, Italy, Poland, Russia, Slovakia, Spain, Sweden and Switzerland. On 1 January 2017, surface physicist Robert Feidenhans’l, currently head of the Niels Bohr Institute at the University of Copenhagen, will take over as chairman of the European XFEL management board from Massimo Altarelli, who has held the role since 2009.

The post European XFEL enters commissioning phase appeared first on CERN Courier.

]]>
https://cerncourier.com/a/european-xfel-enters-commissioning-phase/feed/ 0 News
Crystal Clear celebrates 25 years of success https://cerncourier.com/a/crystal-clear-celebrates-25-years-of-success/ https://cerncourier.com/a/crystal-clear-celebrates-25-years-of-success/#respond Fri, 14 Oct 2016 07:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/crystal-clear-celebrates-25-years-of-success/ Advanced scintillating materials have found their way into novel detectors for physics and medicine.

The post Crystal Clear celebrates 25 years of success appeared first on CERN Courier.

]]>
3D PET/CT image

The Crystal Clear (CC) collaboration was approved by CERN’s Detector Research and Development Committee in April 1991 as experiment RD18. Its objective was to develop new inorganic scintillators that would be suitable for electromagnetic calorimeters in future LHC detectors. The main goal was to find dense and radiation-hard scintillating material with a fast light emission that can be produced in large quantities. This challenge required a large multidisciplinary effort involving world experts in different aspects of material sciences – including crystallography, solid-state physics, luminescence and defects in solids.

From 1991 to 1994, the CC collaboration carried out intensive studies to identify the most suitable scintillator material for the LHC experiments. Three candidates were identified and extensively studied: cerium fluoride (CeF3), lead tungstate (PbWO4) and heavy scintillating glass. In 1994, lead tungstate was chosen by the CMS and ALICE experiments as the most cost-effective crystal compliant with the operational conditions at the LHC. Today, 75,848 lead-tungstate crystals are installed in the CMS electromagnetic calorimeter and 17,920 in ALICE. The former contributed to the discovery of the Higgs boson, which was identified in 2012 by CMS and the ATLAS experiment via, among other channels, its decay into two photons. The CC collaboration’s generic R&D on scintillating materials has brought a deep understanding of cerium ions as scintillation activators and seen the development of lutetium and yttrium aluminium perovskite crystals for both physics and medical applications.

From physics to medicine

In 1997, the CC collaboration made its expertise in scintillators available to industry and society at large. Among the most promising sectors were medical functional imaging and, in particular, positron emission tomography (PET), due to its growing importance in cancer diagnostics and similarities with the functionality of electromagnetic calorimeters (the principle of detecting gamma rays in a PET scanner is identical to that in high-energy physics detectors).

Following this, CC collaboration members developed and constructed several dedicated PET prototypes. The first, which was later commercialised by Raytest GmbH in Germany under the trademark ClearPET, was a small-animal PET machine used for radiopharmaceutical research. At the turn of the millennium, five ClearPET prototypes characterised by a spatial resolution of 1.5 mm were built by the CC collaboration, a major breakthrough in functional imaging at that time. The same crystal modules were also adapted by the CC team at Forschungszentrum Jülich, Germany, to image plants and study carbon transport. A modified ClearPET geometry was also combined with X-ray single-photon detectors by CC researchers at CPPM Marseille, offering simultaneous PET and computed-tomography (CT) acquisition and providing the first simultaneous PET/CT images of a mouse in 2015 (see image above). The simultaneous use of CT and PET allows the excellent position resolution of anatomic imaging (providing detailed images of the structure of tissues) to be combined with functional imaging, which is sensitive to the tissue’s metabolic activity.

After the success of ClearPET, in 2002, CC developed a dedicated PET camera for breast imaging called ClearPEM. This system had a spatial resolution of 1.3 mm and was the first PET imager based on avalanche photodiodes, which were initially developed for the CMS electromagnetic calorimeter. The machine was installed in Coimbra, Portugal, where clinical trials were performed. In 2005, a second ClearPEM machine combined with 3D ultrasound and elastography was developed with the aim of providing anatomical and metabolic information to allow better identification of tumours. This machine was installed in Hôpital Nord in Marseille, France, in December 2010 for clinical evaluations of 10 patients, and three years later it was moved to the San Gerardo hospital in Monza, Italy, to undertake larger clinical trials, which are ongoing.

In 2011, a European FP7 project called EndoTOFPET-US, carried out by a consortium of three hospitals, three companies and six institutes, began the development of a prototype for a novel bi-modal time-of-flight PET and ultrasound endoscope with a spatial resolution better than 1 mm and a time resolution of 200 ps. This was aimed at the detection of early-stage pancreatic or prostatic tumours and the development of new biomarkers for pancreatic and prostatic cancers. Two prototypes have been produced (one for pancreatic and one for prostate cancers), and the first tests of the prostate prototype on a phantom were performed in spring 2015 at the CERIMED centre in Marseille. Work is now ongoing to improve the two prototypes, in view of preclinical and clinical operation.

In addition to the development of ClearPET detectors, members of the collaboration initiated the development of GATE, a GEANT4-based Monte Carlo software package that allows full PET detector systems to be simulated.
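
GATE itself is steered through macro scripts rather than written directly in a general-purpose language, so instead of guessing at its interface, the toy Monte Carlo below (plain Python, with an arbitrary ring radius and source position) illustrates the core of what such a simulation tracks: back-to-back 511 keV annihilation photons whose detection points on a ring define lines of response.

```python
import numpy as np

rng = np.random.default_rng(0)
R = 400.0                      # detector-ring radius (mm), illustrative
src = np.array([30.0, -20.0])  # point-source position (mm), illustrative

def detect(n_events=5):
    """Emit back-to-back photon pairs from src; each pair's two crossing
    points on the ring define one line of response (LOR)."""
    lors = []
    for _ in range(n_events):
        phi = rng.uniform(0.0, np.pi)       # random pair axis
        d = np.array([np.cos(phi), np.sin(phi)])
        b, c = src @ d, src @ src - R**2
        t1, t2 = np.roots([1.0, 2 * b, c])  # solve |src + t*d| = R
        lors.append((src + t1 * d, src + t2 * d))
    return lors

for p1, p2 in detect():
    print(np.round(p1, 1), "<-->", np.round(p2, 1))
```

Reconstructing where many such LORs intersect recovers the source distribution; GATE adds to this full GEANT4 physics, detector geometry, timing and read-out.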

Clear impact

In 1992, the CC collaboration organised the first international conference on inorganic scintillators and their applications, which led to a global scientific community of around 300 people. Today, this community comes together every two years at the SCINT conferences, the next instalment of which will take place in Chamonix, France, from 18 to 22 September 2017.

To this day, the CC collaboration continues to investigate new scintillators, their underlying scintillation mechanisms and their radiation-hardness characteristics – in addition to developing detectors. Among its most recent activities is the investigation of key parameters in scintillating detectors that enable very precise timing information for various applications. These include mitigating the effect of “pile-up” caused by the high event rate at particle accelerators operating at high peak luminosities, and also medical applications in time-of-flight PET imaging. This research requires the study of new materials and processes to identify ultrafast scintillation mechanisms such as “hot intraband luminescence” or quantum-confined excitonic emission with sub-picosecond rise time and sub-nanosecond decay time. It also involves investigating the enhancement of the scintillator light collection by using various surface treatments, such as nano-patterning with photonic crystals. CC recently initiated a European COST Action called Fast Advanced Scintillator Timing (FAST) to bring together European experts from academia and industry to ultimately achieve scintillator-based detectors with a time precision better than 100 ps, which provides an excellent training opportunity for researchers interested in this domain.
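
The reason sub-100 ps timing is so valuable follows from a one-line relation: in time-of-flight PET, a coincidence time difference Δt localises the annihilation point along the line of response to within

$$\Delta x = \frac{c\,\Delta t}{2},$$

so 200 ps corresponds to about 3 cm and 100 ps to about 1.5 cm, which directly improves the signal-to-noise ratio of the reconstructed image.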

Among other recent activities of the CC collaboration are new crystal-production methods. Micro-pulling-down techniques, which allow inorganic scintillating crystals to be grown in the shape of fibres with diameters ranging from 0.3 to 3 mm, open the way to attractive detector designs for future high-energy physics experiments by replacing a block of crystals with a bundle of fibres. A Horizon 2020 European RISE Marie Skłodowska-Curie project called Intelum has been set up by the CC collaboration to explore the cost-effective production of large quantities of fibres. More recently, the development of new PET crystal modules has been launched by CC collaborators. These make use of new silicon-photomultiplier photodetectors and offer high spatial resolution (1.5 mm), depth-of-interaction capability (better than 3 mm) and fast timing resolution (better than 200 ps).

Future directions

For the past 25 years, the CC collaboration has actively carried out R&D on scintillating materials, and investigated their use in novel ionising radiation-detecting devices (including read-out electronics and data acquisition) for use in particle-physics and medical-imaging applications. In addition to significant progress made in the understanding of scintillation mechanisms and radiation hardness of different materials, the choice of lead tungstate for the CMS electromagnetic calorimeter and the realisation of various prototypes for medical imaging are among the CC collaboration’s highlights so far. It is now making important contributions to understanding the key parameters for fast-timing detectors.

The various activities of the CC collaboration, which today has 29 institutional members, have resulted in more than 650 publications and 72 PhD theses. The motivation of CC collaboration members and the momentum generated throughout its many projects open up promising perspectives for the future of inorganic scintillators and their use in HEP and other applications.

• An event to celebrate the 25th anniversary of the CC collaboration will take place at CERN on 24 November.

The post Crystal Clear celebrates 25 years of success appeared first on CERN Courier.

]]>
https://cerncourier.com/a/crystal-clear-celebrates-25-years-of-success/feed/ 0 Feature Advanced scintillating materials have found their way into novel detectors for physics and medicine. https://cerncourier.com/wp-content/uploads/2016/10/CCcry1_09_16.jpg
Proton therapy enters precision phase https://cerncourier.com/a/proton-therapy-enters-precision-phase/ https://cerncourier.com/a/proton-therapy-enters-precision-phase/#respond Fri, 16 Sep 2016 13:55:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/proton-therapy-enters-precision-phase/ Résumé Protonthérapie : l’ère de la précision La protonthérapie est une technique de radiothérapie innovante, qui peut traiter des tumeurs avec beaucoup plus de précision que les rayons X ou les rayons gamma. Le nombre de centres de traitement par protonthérapie augmente rapidement, et offre aux patients des traitements plus efficaces avec un risque de […]

The post Proton therapy enters precision phase appeared first on CERN Courier.

]]>
Résumé

Proton therapy enters precision phase

Proton therapy is an innovative radiotherapy technique that can treat tumours far more precisely than X-rays or gamma-rays. The number of proton-therapy centres is growing rapidly, offering patients more effective treatment with a lower risk of complications. At the Centre Antoine Lacassagne in Nice, a new high-energy proton-therapy facility, which traces its origins to a 30-year collaboration with CERN, is now preparing to treat its first patient. For the same performance, its accelerator is four times lighter and consumes eight times less energy than current machines, and it can treat all types of tumours located deep inside the human body.

Each year, millions of people worldwide undergo treatment for cancer based on focused beams of high-energy photons. Produced by electron linear accelerators (linacs), photons with energies in the MeV range are targeted on cancerous tissue, where they indirectly ionise the atoms of DNA and so reduce the ability of cells to reproduce. Photon therapy has been in clinical use for more than a century, following the discovery of X-rays by Roentgen in 1895, and has helped to save or improve the quality of countless lives.

Proton therapy, a subclass of particle or hadron therapy, is an innovative alternative technique in radiotherapy. It can treat tumours much more precisely than X- or gamma-rays because the dose delivery of protons is ballistic: protons have a definite range, characterised by the Bragg peak, which depends on their energy. This ballistic behaviour gives protons their advantage over X-rays, providing a dose deposition that better matches tumour contours while limiting the dose to surrounding tissue. This property, first identified by accelerator pioneer Robert Wilson in 1946 when he was involved in the design of the Harvard Cyclotron Laboratory, results in greater treatment efficiency and a lower risk of complications.
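
The range–energy behaviour behind the Bragg peak is often approximated by the empirical Bragg–Kleeman rule. With commonly quoted constants for protons in water (used here purely as an illustration),

$$R \approx \alpha E^{p}, \qquad \alpha \approx 2.2\times10^{-3}\ \mathrm{cm\,MeV}^{-p}, \quad p \approx 1.77,$$

a 65 MeV proton stops after roughly 3.6 cm, well matched to ocular tumours, while 230 MeV gives roughly 33 cm, consistent with the 32 cm treatment depth quoted below for the S2C2.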

The pioneers of proton therapy used accelerators at physics laboratories, including Uppsala in Sweden in 1957, the Harvard Cyclotron Laboratory in Boston, US, in 1961, and the Swiss Institute for Nuclear Research in 1984. The first dedicated clinical proton-therapy facility, driven by a low-energy cyclotron, was inaugurated in 1989 at the Clatterbridge Centre for Oncology in the UK. The following year, a dedicated synchrotron designed at Fermilab began operating in the US at the Loma Linda University Medical Center in California. By the early 2000s, the number of treatment centres had risen to around 20, and today proton therapy is booming: some 45 facilities are in operation worldwide, with around 20 under construction and a further 30 at the planning stage in various countries around the world (see www.ptcog.ch).

Towards MEDICYC

Modern proton therapy exploits an active technique called pencil-beam scanning, which creates a pointillist 3D painting of the tumour volume by displacing the proton beam with appropriate magnets. Moreover, different irradiation ports are generally possible thanks to rotating gantries. This delivery technique is competitive with the most advanced forms of X-ray irradiation, such as intensity-modulated radiation therapy (IMRT), tomotherapy and CyberKnife, because it uses a smaller number of entry ports and hence reduces the overall absorbed dose to the patient.

Owing to its high dose accuracy, proton therapy has historically been oriented towards the treatment of uveal melanoma and base-of-skull tumours, for which X-rays are less efficient. Today, however, proton therapy is used to treat any tumour type, with a predilection for paediatric treatment. Indeed, by limiting the whole-body integral dose to an absolute minimum, the risk of side effects such as radiation-induced secondary cancers is also minimised.

Particle physics, and CERN in particular, has played a key role in the success of proton therapy. One of the first facilities operating in Europe was MEDICYC – a 65 MeV proton medical cyclotron that was initially devoted to neutron production for cancer therapy. It was installed at the Centre Antoine Lacassagne (CAL) in Nice in 1991, where the first proton-therapy treatment for ocular melanoma was achieved in France. MEDICYC was designed by a small team of young CAL members hosted by CERN in the PS division, and the advice of the passionate experts there was key to the success of this accelerator. Preliminary studies for MEDICYC and the first test of the radiofrequency accelerating system were performed at CERN. Indeed, because the cyclotron was completed before the building that would house it, it was proposed to assemble the cyclotron magnet at CERN in the East Hall of the PS division, to perform the magnetic-measurement campaigns.

During its 25-year operational lifetime, which began in June 1991, MEDICYC has reached a high level of reliability and successfully treated more than 5500 patients for various ocular tumours with a 96% local control rate. Owing to its high-quality dose profile (0.8 mm dose fall-off beyond the Bragg peak, which is of the utmost importance for irradiating tumours close to the optic nerve), MEDICYC will continue to run its medium-energy proton-therapy programme. Moreover, CAL is investigating a MEDICYC improvement programme to increase the beam intensities in view of new medical-isotope production at high energies with protons and deuterons.

On 30 June this year, a new proton-therapy centre called the Institut Méditerranéen de Protonthérapie (IMPT) was inaugurated at CAL, marking a new phase in European advanced hadron therapy. Joining MEDICYC as the driver of this new facility is a new cyclotron called the Superconducting Synchro-cyclotron (S2C2). This facility, which will expand the proton-therapy activity of MEDICYC, uses the latest technology to precisely target tumours while controlling the intensity and spatial distribution of the dose with fine precision. It is therefore ideal for treating base-of-skull, head-and-neck and sarcoma tumours, with priority given to paediatric cancers, and is expected to treat up to 250 patients per year in its first phase.

CERN beginnings

The new facility at CAL has its roots in a CERN-led project called EULIMA (European Light Ions Medical Accelerator) – a joint initiative at the end of the 1980s to bring the potential benefit of hadron therapy with light ions to cancer patients in Europe. Historically, CAL was involved with several European institutes in feasibility studies for EULIMA. The feasibility study group was hosted by CERN, and the main design option for the accelerator was a four-magnetic-sector cyclotron with a single large cylindrical superconducting excitation coil designed by CERN magnet specialist Mario Morpurgo. CAL was selected as a candidate site to host the EULIMA prototype because it offered adequate space in the MEDICYC building to house the machine and treatment rooms, as well as a ready supply of medical, scientific and technical staff in an attractive location.

When the EULIMA project came to an end in 1992, the empty EULIMA hall remained available for future development projects in high-energy proton therapy. As a result, in 2011, we were able to construct the new S2C2 facility at CAL at low cost. This compact, approximately 40-tonne machine provides proton beams with an energy of 230 MeV and delivers its dose using dynamic pencil-beam scanning (PBS). Its design is the result of a collaboration between AIMA (a spin-out company from CAL) and the Belgian medical firm IBA.

The facility comprises a beamline that feeds an R&D room for research teams, which have decided to commit themselves to a national research programme called France Hadron. The programme gathers several hadron-therapy centres based at Paris-Orsay, Lyon, Caen, Toulouse and Nice, in addition to several universities and national public research institutions, to co-ordinate and organise a national programme of research and training in hadron therapy. This programme aims at bringing nuclear-physics techniques to clinical research through dosimetry, radiation biology, imaging, control of target positioning, and quality-control instruments.

As is the case for eye treatment at MEDICYC, the new facility will operate in co-operation with the Léon Bérard cancer centre in Lyon and other oncology centres in the south of France. The new high-energy proton facility features many technological innovations compared with existing systems. The accelerator is four times lighter and consumes eight times less energy than current machines for the same performance, and its maximum energy of 230 MeV allows tumours to be treated at depths of up to 32 cm in the human body. Its significantly lower cost represents a particularly attractive alternative compared with the global industrial standard. It also foreshadows a major development of proton therapy in the coming years, because compact synchrocyclotron technology is also being developed for the acceleration of alpha particles and heavier ions for hadron therapy.

A major innovation is its compact rotating gantry, the first prototype of which was installed in the US in 2013. The mobility of the new beamline allows operators to direct the radiation beam at different angles of incidence around the patient while offering unprecedented compactness, reducing costs further. The new S2C2 and the future upgrade programme of MEDICYC embody the medical mission of CAL at large, bringing together advanced proton therapy for treating patients and scientific research activities with multidisciplinary teams of medical physicists and radiobiologists.

The post Proton therapy enters precision phase appeared first on CERN Courier.

]]>
https://cerncourier.com/a/proton-therapy-enters-precision-phase/feed/ 0 Feature
CERN to produce radioisotopes for health https://cerncourier.com/a/cern-to-produce-radioisotopes-for-health/ https://cerncourier.com/a/cern-to-produce-radioisotopes-for-health/#respond Fri, 16 Sep 2016 13:55:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/cern-to-produce-radioisotopes-for-health/ Résumé Le CERN produira des radio-isotopes pour la médecine Le lien entre les communautés des accélérateurs et de la médecine remonte à presque 50 ans. Aujourd’hui, alors que les physiciens développent la nouvelle génération de machines pour la recherche, les médecins imaginent de nouvelles méthodes pour diagnostiquer et traiter les maladies neurodégénératives et les cancers. […]

The post CERN to produce radioisotopes for health appeared first on CERN Courier.

]]>
Résumé

CERN to produce radioisotopes for medicine

The link between the accelerator and medical communities goes back almost 50 years. Today, as physicists develop the next generation of research machines, medical doctors are devising new methods to diagnose and treat neurodegenerative diseases and cancers. CERN’s MEDICIS project aims to develop new isotopes that can be used both as diagnostic agents and for brachytherapy or unsealed internal radiotherapy, for the treatment of non-operable brain and pancreatic cancers and other forms of the disease. The facility, conceived in 2010 and due to become operational in 2017, uses a proton beam and the ISOLDE radioactive-ion-beam facility to produce medical isotopes. These isotopes will first be supplied to hospitals and research centres in Switzerland, then progressively to other laboratories in Europe and elsewhere in the world.

Accelerators and their related technologies have long been developed at CERN to undertake fundamental research in nuclear physics, probe the high-energy frontier or explore the properties of antimatter. Some of the spin-offs of this activity have become key to society. A famous example is the World Wide Web; others are medical applications such as the positron emission tomography (PET) scanner prototypes and image-reconstruction algorithms developed in collaboration between CERN and Geneva University Hospitals in the early 1990s. Today, as accelerator physicists develop the next-generation radioactive-beam facilities to address new questions in nuclear structure – in particular HIE-ISOLDE at CERN, SPIRAL 2 at GANIL in France, ISOL@Myrrha at SCK•CEN in Belgium and SPES at INFN in Italy – medical doctors are devising new approaches to diagnose and treat diseases such as neurodegenerative disorders and cancers.

The bridge between the radioactive-beam and medical communities dates back to the late 1970s, when radioisotopes collected from a secondary beam at CERN’s Isotope mass Separator On-Line facility (ISOLDE) were used to synthesise an injectable radiopharmaceutical for a patient suffering from cancer. 167Tm-citrate, a radiolanthanide associated with a chelating agent, was used to image a lymphoma, revealing the spread-out cancerous tumours. While PET became a reference protocol to provide quantitative imaging information, several other pre-clinical and pilot clinical tests have been performed with non-conventional radioisotopes collected at radioactive-ion-beam facilities – both for diagnosis and for therapeutic applications.

Despite significant progress in the field of oncology over the past decade, however, the prognosis of certain tumours is still poor – particularly for patients presenting advanced glioblastoma multiforme (a very aggressive form of brain cancer) or pancreatic adenocarcinoma. The latter is a leading cause of cancer death in the developed world, and surgical resection is the only potentially curative treatment, although many patients are not candidates for surgery. Although external-beam gamma radiation and chemotherapy are used to treat patients with non-operable pancreatic tumours, and survival rates can be improved by combined radio- and chemotherapy, there is still a clear need for novel treatment modalities for pancreatic cancer.

A new project at CERN called MEDICIS aims to develop non-conventional isotopes to be used as a diagnostic agent and for brachytherapy or unsealed internal radiotherapy for the treatment of non-resectable brain and pancreatic cancer, among other forms of the disease. Initiated in 2010, the facility will use a proton beam at ISOLDE to produce isotopes that first will be destined for hospitals and research centres in Switzerland, followed by a progressive roll-out to a larger network of laboratories in Europe and beyond. The project is now approaching its final phase, with start-up foreseen in June 2017.

A century of treatment

The idea of using radioisotopes to cure cancer was first proposed by Pierre Curie soon after his discovery of radium in 1898. Radium appealed to many physicians because its penetrating rays could be applied superficially or inserted surgically into the body – a method called brachytherapy. The first clinical trials took place at the Curie Institute in France and at St Luke’s Memorial Hospital in New York at the beginning of the 20th century, for the treatment of prostate cancer.

A century later, in 2013, a milestone was reached with the successful clinical trials of 223Ra in the form of the salt solution RaCl2, which was injected into patients suffering from prostate cancers with bone metastases. The positive effect on patient survival was so clear in the final clinical validation (the so-called phase III trial) that it was terminated prematurely to allow patients who had received a placebo to be given the effective drug. Today, the availability of new isotopes, medical imaging, robotics, monoclonal antibodies and a better understanding of tumour mechanisms has enabled progress in both brachytherapy and unsealed internal radiotherapy. Radioisotopes can now be placed closer to and even inside the tumour cells, killing them with minimal damage to healthy tissue.

CERN-MEDICIS aims to further advance this area of medicine. New isotopes with specific types of emission, tissue penetration and half-life will be produced and purified, based on expertise acquired during the past 50 years in producing beams of radioisotope ions for ISOLDE’s experimental programme. Diagnosis by single-photon emission computed tomography (SPECT), a form of scintigraphy, accounts for the vast majority of worldwide isotope consumption, based on the gamma-emitter 99mTc, which is used for functional probing of the brain and various other organs. PET protocols are increasingly used, based on the positron emitter 18F and, more recently, a 68Ga compound. Therapy, on the other hand, is mostly carried out with beta emitters such as 131I and, more recently, 177Lu, or with 223Ra for the new application of targeted alpha therapy. Other isotopes also offer clear benefits, such as 149Tb, which is the lightest alpha-emitting radiolanthanide and also has positron-emitting properties.

Driven by ISOLDE

With 17 Member States and an ever-growing number of users, ISOLDE is a dynamic facility that has provided beams for around 300 experiments at CERN in its 50-year history. It allows researchers to explore the structure of the atomic nucleus and study particle physics at low energies, and it provides radioactive probes for solid-state physics and biophysics. Through 50 years of collaboration between the technical teams and the users, a deep bond has formed, and the facility evolves hand-in-hand with new technologies and research topics.

CERN MEDICIS is the next step in this adventure, and the user community is joining in efforts to push the development of the machine in a new direction. The project was initiated six years ago by a relatively small collaboration involving CERN, KU Leuven, EPFL and two local university hospitals (CHUV in Lausanne and HUG in Geneva). One year later, in 2011, CERN decided to streamline the medical production of radioisotopes and to offer grants dedicated to technology transfer. One such grant covered the mechanical conveyor system that transports the irradiated targets, and construction of the CERN MEDICIS building began in September 2013. The installation of the services, mass separator and laboratory is now under way.

At ISOLDE, physicists direct a high-energy proton beam from the Proton Synchrotron Booster (PSB) at a target. Since the beam loses only 10% of its intensity and energy on hitting the target, the particles that pass through it can still be used. For CERN-MEDICIS, a second target therefore sits behind the first and is used for exotic-isotope generation. Key to the project is a mechanical system that transports a fresh target and its ion source into the beam dump of ISOLDE’s high-resolution separator (HRS) target station, irradiates it with the proton beam from the PSB to generate the isotopes, then returns it to the CERN-MEDICIS laboratory. The system was fully commissioned in 2014 under proton-beam irradiation with a target that was later used to produce a secondary beam, thus validating the full principle. A crucial functional element was still missing, however: the isotope mass separator, along with its services and target station. Coincidentally, CERN MEDICIS started just as the operation of KU Leuven’s isotope-separation facility ended, and a new lease of life could therefore be given to its dipole-magnet separator, which was delivered to CERN earlier this year for testing and refurbishment.

A close collaboration is growing at MEDICIS centred around the core team at CERN but involving partners from fundamental nuclear physics, material science, radiopharmacy, medical physics, immunology, radiobiology, oncology and surgery, with more to come.

Training network

With such an exceptional tool at hand, and based on growing pre-clinical research experience at local university hospitals, in 2014 CERN set up a H2020 Innovative Training Network to ensure MEDICIS is fully exploited. This “Marie Skłodowska-Curie actions” proposal, entitled MEDICIS-Promed (MEDICIS-produced radioisotope beams for medicine), was submitted to the European Commission. The goal of this 14-institution consortium is to train a new generation of scientists to develop systems for personalised medicine that combine functional imaging and treatments based on radioactive-ion-beam mass separation. Subsystems are to be developed for new radiopharmaceuticals, for isotope mass separators at medical cyclotrons, and for mass-separated 11C for PET-aided hadron therapy, specifically to treat ovarian cancer. Pre-clinical experiments have already started, with the first imaging studies ever done with these exotic radioisotopes. For this, a specific ethical review board has been implemented within the consortium, chaired by independent members.

With the MEDICIS facility entering operation next year, an increasing range of innovative isotopes will progressively become accessible. These will be used for fundamental studies in cancer research, for new imaging and therapy protocols in cell and animal models, and for pre-clinical trials – possibly extended to early-phase clinical studies up to phase-I trials. During the next few years, 500 MBq isotope batches purified by electromagnetic mass separation combined with chemical methods will be collected on a weekly basis. This is a step increase in production that will make these innovative isotopes more available to biomedical research laboratories, compared with the present production of a few days per year in a facility such as ISOLDE.
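
As a back-of-the-envelope check (using 225Ac, half-life 9.92 days, as the example), a 500 MBq batch converts to atom number through A = λN:

$$N = \frac{A\,t_{1/2}}{\ln 2} = \frac{5\times10^{8}\ \mathrm{s^{-1}} \times 8.6\times10^{5}\ \mathrm{s}}{0.693} \approx 6\times10^{14}\ \text{atoms},$$

i.e. only about 0.2 μg of material once multiplied by the 225 u atomic mass, so electromagnetic mass separation is purifying truly trace quantities.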

Staged production

During its initial stage in 2017, only low-Z materials, such as titanium foils and Y2O3 ceramics, will be used as targets. From these, we will produce batches of several hundred MBq of 44,47Sc and 61,64Cu. In the second stage, tentatively scheduled for 2018, we will use targets with nuclei of higher atomic number, such as tantalum foils, to reach some of the most interesting terbium and lanthanide isotopes. In a final phase in 2018, we foresee the use of uranium and thorium targets to reach an even wider range of isotopes and most of the other alpha-emitters.

Selected isotopes will first be tested in vitro for their capacity to destroy glioblastoma, pancreatic adenocarcinoma or neuroendocrine tumour cells, and in vivo by using mouse models of cancer. We will also test the isotopes for their direct effect on tumours and when they are coupled to peptides with tumour-homing capacities. New delivery methods for brachytherapy using stereotactic, endoscopic ultrasonographic-guided or robot-assisted surgery will be established in large-animal models.

Moreover, this new facility marks the entrance of CERN into the era of theranostics. This growing oncological field allows nuclear-medicine physicians to verify and quantify the presence of cellular and molecular targets in a given patient with the diagnostic radioisotope, before treating the disease with the therapeutic radioisotope. The prospect of a dedicated facility at CERN for the production of innovative isotopes, together with local leading institutes in life and medical sciences and a large network of laboratories, makes this an exciting scientific programme in the coming years.

The post CERN to produce radioisotopes for health appeared first on CERN Courier.

]]>
https://cerncourier.com/a/cern-to-produce-radioisotopes-for-health/feed/ 0 Feature
Energetic protons boost BNL isotope production https://cerncourier.com/a/energetic-protons-boost-bnl-isotope-production/ https://cerncourier.com/a/energetic-protons-boost-bnl-isotope-production/#respond Fri, 16 Sep 2016 13:55:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/energetic-protons-boost-bnl-isotope-production/ Accelerator upgrades will help to meet the demand for strontium-82.

The post Energetic protons boost BNL isotope production appeared first on CERN Courier.

]]>
The mission of the US Department of Energy (DOE) isotope programme is to produce and distribute radioisotopes that are in short supply and in high demand for medical, industrial and environmental uses. The DOE programme also maintains the unique infrastructure of national laboratories across the country, one of which is Brookhaven National Laboratory’s medical radioisotope programme, MIRP. Although there are many small accelerators in the US that produce radioisotopes, the availability of proton energies up to 200 MeV from the Brookhaven Linac Isotope Producer (BLIP) is unique.

Radioisotopes are of interest in nuclear medicine both for diagnostic imaging and for therapy. The most important aspect of Brookhaven’s isotope programme is the large-scale production and supply of clinical-grade strontium-82 (82Sr). Although 82Sr is not directly used in humans, its short-lived daughter product 82Rb is a potassium mimic that, upon injection, is rapidly taken up by viable cardiac tissue. 82Sr is therefore supplied to hospitals as a generator for positron emission tomography (PET) scans of the heart, where the short half-life of 82Rb (76 seconds) allows multiple scans to be performed while delivering minimal doses to the patient. At present, up to 350,000 patients per year in the US receive such PET scans, but demand is growing beyond capacity.
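
The generator arithmetic is what makes this practical: the 82Sr parent (half-life roughly 25 days) outlives its 76-second daughter by a factor of tens of thousands, so after each elution the 82Rb activity regrows to near-equilibrium within minutes. A minimal sketch of that ingrowth:

```python
import math

T_RB = 76.0                    # 82Rb half-life (s)
LAM_RB = math.log(2) / T_RB

def rb82_activity(t, a_sr=1.0):
    """82Rb activity t seconds after an elution, in units of the 82Sr
    parent activity a_sr. The ~25 d parent half-life is effectively
    constant over minutes, so the daughter shows simple secular ingrowth."""
    return a_sr * (1.0 - math.exp(-LAM_RB * t))

for t in (30, 76, 150, 300, 600):
    print(f"t = {t:3d} s: {rb82_activity(t):.2f} of parent activity")
```

Within five minutes the generator is back above 90% of equilibrium, which is what allows repeat cardiac scans in a single patient session.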

There is also significant promise in the use of alpha emitters to treat a variety of diseases, including metastatic cancer, viral and fungal infections and even HIV; the leading candidate is the alpha emitter 225Ac. Thanks to a series of upgrades completed this year, Brookhaven is now in a position to boost production of both of these vital medical isotopes.

Protons on target

The BLIP was built in 1972 and was the world’s first facility to utilise high-energy, high-current protons for radioisotope production. It works by diverting the excess beam of Brookhaven’s 200 MeV proton linac to water-cooled target assemblies that contain specially engineered targets and degraders to allow optimal energy to be delivered to the targets. The use of higher-energy particles allows relatively thick targets to be irradiated, in which the large number of target nuclei compensates for the generally smaller reaction cross-sections compared with low-energy nuclear reactions.
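
That compensation can be made explicit with the standard thick-target yield integral (a textbook form, not a BLIP-specific formula): for a beam entering the target at energy E_in and leaving, or stopping, at E_out,

$$Y \;\propto\; \frac{N_A\,\rho}{M}\int_{E_{\mathrm{out}}}^{E_{\mathrm{in}}}\frac{\sigma(E)}{-\,\mathrm{d}E/\mathrm{d}x}\,\mathrm{d}E,$$

where σ(E) is the production cross-section, dE/dx the stopping power and N_Aρ/M the number density of target nuclei. Raising the beam energy lengthens the usable integration path through the target, which can outweigh a smaller cross-section.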

Although the maximum proton energy is 200 MeV, lower energies can be delivered by sequentially turning off the accelerating sections to achieve 66, 92, 117, 139, 160, 181 and 200 MeV beams. This is the only linac with such a capability, and its energy and intensity can be controlled on a pulse-by-pulse basis. As a result, the linac can simultaneously supply high-intensity pulses to the BLIP and a low-intensity polarised proton beam to the booster synchrotron for injection into the Alternating Gradient Synchrotron (AGS) and the Relativistic Heavy Ion Collider (RHIC) for Brookhaven’s nuclear-physics programme. This shared use allows for cost-effective operation. The BLIP design also enables bombardment of up to eight targets, offering the unique ability to produce multiple radioisotopes at the same time (see table). Target irradiations for radiation-damage studies are also performed, including for materials relevant to collimators used at the LHC and Fermilab.

The Gaussian beam profile of the linac results in a very high power density at the target centre. Until recently, the intensity of the beam was limited to 115 μA to ensure the survival of the target. This year, however, a raster system was installed that sweeps the beam across the target to deposit it more uniformly, allowing the current on the target to be increased. This system requires rapidly cycling magnets and power supplies to continuously move the beam spot, and has been fully operational since January 2016.

Production of 82Sr is accomplished by irradiating a target comprising rubidium-chloride salt with 117 MeV protons, with the raster parameters driven by the thermal properties of the target. This demanded diagnostic devices in the BLIP beamline that enable the profile of the beam spot to be measured, both for initial device tuning and commissioning and for routine monitoring. These included a laser-profile monitor, beam-position monitor and plunging multi-wire devices. It was also necessary to build an interlock system to detect raster failure, because the target could be destroyed rapidly if the smaller-diameter beam spot stopped moving. The beam is moved in a circular pattern at a rate of 5 kHz with two different radii to create one large and one smaller circle. The radius values and the number of beam pulses for each radius can be programmed to optimise the beam distribution, allowing a five-fold reduction in peak power density.
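
A minimal sketch of such a dual-radius circular raster follows; the 5 kHz rate is from the text, while the radii and dwell counts are illustrative assumptions.

```python
import numpy as np

F = 5e3                            # raster frequency (Hz), as quoted above
PATTERN = [(6.0, 40), (3.0, 20)]   # (radius in mm, samples) -- illustrative

def raster_positions():
    """Trace the beam-spot centre around two alternating circles.

    Programming the radii and the dwell on each circle spreads the
    Gaussian spot's heat load over an annulus rather than a point,
    which is what permits the roughly five-fold cut in peak power density."""
    t, dt = 0.0, 1.0 / (F * 20)    # 20 samples per revolution
    while True:
        for r, n in PATTERN:
            for _ in range(n):
                yield r * np.cos(2 * np.pi * F * t), r * np.sin(2 * np.pi * F * t)
                t += dt

gen = raster_positions()
print(np.round([next(gen) for _ in range(5)], 2))
```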

To exploit the higher current that the rastered targets can now tolerate, a parallel effort was required to increase the linac-beam intensity. This was accomplished by extending the pulse length by approximately five per cent and optimising the low-energy beam-transport parameters. These adjustments have now raised the maximum beam current to 173 μA, boosting radioisotope production by more than a third. After irradiation, all targets must be chemically processed to separate the radioisotope of interest from the target material and all other co-produced radioisotopes, which is carried out at Brookhaven’s dedicated target-processing laboratory.

Tri-lab effort

One of the highest-priority research efforts of the MIRP is to assess the feasibility of using an accelerator to produce the alpha emitter 225Ac. Alpha particles deposit a high dose over a very short path length, which means that abnormal diseased tissue can be irradiated heavily while limiting the dose to normal tissue. Several promising preclinical and clinical trials of alpha emitters have taken place in the US and Europe, and the 10-day half-life of 225Ac would enable targeted alpha radiotherapy using large proteins such as monoclonal antibodies and peptides for the selective treatment of metastatic disease. 225Ac decays through multiple alpha emissions to 213Bi, which is an alpha emitter with a half-life of 46 minutes and can therefore be used with peptides and small molecules for rapid targeted alpha therapy.

Although 225Ac is the leading-candidate alpha emitter, vital research has been hindered by its very limited availability. To accelerate this development, a formal “Tri-Lab” collaboration has been established between BNL and two other DOE laboratories: Los Alamos National Laboratory (LANL) and Oak Ridge National Laboratory (ORNL). The aim is to evaluate the feasibility of irradiating thorium targets with high-energy proton beams to produce much larger quantities of 225Ac for medical applications. Because radioisotope yield scales directly with beam intensity, higher intensities translate into higher yields of these and other useful isotopes. So far, BNL and LANL have measured cross-sections and have developed and irradiated relevant alpha-emitter targets for shipment to ORNL and other laboratories. These include several targets containing up to 5.9 GBq of 225Ac activity, and others for chemical and biological evaluation both of direct 225Ac use and of a generator providing the shorter-lived 213Bi. Similar irradiation methods are available at LANL and also at TRIUMF in Canada.

Irradiation of thorium metal at high energy also creates copious fission products. This complicates the chemical purification but also presents an opportunity, because some co-produced radiometals are of interest for other medical applications. The BNL group therefore plans to develop and evaluate methods to extract these from the irradiated thorium target in a form suitable for use. In addition to 225Ac, the BNL programme is evaluating the future production of other radioisotopes that can be used as “theranostics”. This term refers to isotope pairs, or even a single radioisotope, that can be used for both imaging and therapeutic applications. Among the potentially attractive isotopes for this purpose that can be produced at BLIP are the beta- and gamma-emitters 186Re and 47Sc.

BNL has served as a birthplace of nuclear medicine since the 1950s, and saw the first use of high-intensity, high-power beams for radioisotope production. Under the guidance of the DOE isotope programme, the laboratory is using its unique accelerator facilities to develop and supply radioisotopes for imaging and therapy. Completed and future upgrades will enable the large-scale production of alpha emitters and theranostics to meet presently unmet clinical needs. These will enable personalised patient treatments and overall improvements in patient health and quality of life.

The post Energetic protons boost BNL isotope production appeared first on CERN Courier.

]]>
https://cerncourier.com/a/energetic-protons-boost-bnl-isotope-production/feed/ 0 Feature Accelerator upgrades will help to meet the demand for strontium-82. https://cerncourier.com/wp-content/uploads/2016/09/CCfea33_08_16.jpg
TRIUMF targets alpha therapy https://cerncourier.com/a/triumf-targets-alpha-therapy/ https://cerncourier.com/a/triumf-targets-alpha-therapy/#respond Fri, 16 Sep 2016 12:55:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/triumf-targets-alpha-therapy/ External-beam radiation therapy is used routinely to treat many different types of cancerous tumours, delivering a targeted dose of radiation to cancer cells while sparing surrounding healthy tissue as much as possible. While there have been dramatic improvements in the control of patient and tumour dose during recent years, many challenges persist. These include side […]

The post TRIUMF targets alpha therapy appeared first on CERN Courier.

]]>

External-beam radiation therapy is used routinely to treat many different types of cancerous tumours, delivering a targeted dose of radiation to cancer cells while sparing surrounding healthy tissue as much as possible. While there have been dramatic improvements in the control of patient and tumour dose during recent years, many challenges persist. These include side effects such as depressed immunity, which makes patients susceptible to post-treatment infections, and an increase in secondary cancers.

An alternative approach involves delivering a therapeutic radiation dose to tumour cells selectively through a strategy similar to that for molecular imaging: therapeutic isotopes are incorporated into complex pharmaceuticals for specific, targeted delivery of a potent radiation dose directly to cancerous cells. This approach has been recognised since the time of Madame Curie, but even after a century of development, this application remains woefully unoptimised.

To study the full potential of radionuclide therapy, the medical research community is increasingly demanding therapeutic alpha- and beta-emitting isotopes to treat advanced metastatic cancer and other diffuse diseases. Such therapeutic isotopes are changing the cancer-treatment landscape, yet limited availability and high cost are significantly hampering further research and development.

Targeted radionuclide therapy

Targeted radionuclide therapy (TRT) involves the injection of particle-emitting radionuclides appended to a biological targeting molecule, which directs a lethal dose of radiation to a specific site within the body. The short range and highly cytotoxic nature of alpha and beta particles destroys small, diffuse and post-operative residual tumours while minimising damage to healthy tissue. TRT’s strength lies in the diversity and adaptability of both the isotopes and the targeting molecules, which include monoclonal antibodies, antibody fragments, nanoparticles, and small peptides and molecules. Because this allows an optimal delivery regimen to be developed for each application, TRT isotopes are generating significant interest internationally.

Within the Life Sciences Division of TRIUMF in Vancouver, Canada, TRT is now an active research effort. The goal is to exploit TRIUMF’s production and radiochemistry capabilities to enable fundamental and applied research with a spectrum of isotopes across different disciplines. In the near-to-medium term, TRIUMF plans to develop platform technologies to enable accelerator-based radiometallic isotope production and applications beyond the current state-of-the-art. Access to a host of metallic isotopes will allow TRIUMF to leverage its radiochemistry expertise to demonstrate the synthesis of novel radiopharmaceuticals, including TRT drugs.

Alpha therapy in sight

Targeted alpha therapy (TAT) is a type of TRT that exploits the high linear-energy transfer of alpha particles (figure 1) to maximise tumour-cell destruction while minimising damage to surrounding cells. As such, TAT has tremendous potential to become a very powerful, selective tool for personalised cancer treatments. To fulfil its promise, however, TAT relies heavily on new developments in isotope production. It also demands organic, bioinorganic and organometallic synthesis techniques to create new molecular probes, and novel techniques to address the stability of metal complexes in vivo.

Several promising alpha-emitting radionuclides are currently under consideration worldwide – including 149Tb, 211At, 212Bi, 212Pb, 213Bi, 223Ra, 225Ac, 226Th and 230U – and very promising results have already emerged from clinical and pre-clinical studies of TAT agents. Progress at several laboratories is fuelling great optimism in the medical community. For example, the US Food and Drug Administration recently approved the use of the alpha emitter 223RaCl2 (registered under the trademark Xofigo) for pain relief from bone metastases, and several other TAT drugs are in the clinical-trial pipeline.

Securing a constant supply of clinically relevant amounts of alpha-emitting radionuclides remains a challenge, since their production requires high-Z targets and a complex infrastructure. “Generator systems” are a convenient source of TAT isotopes: for example, 225Ac (which has a half-life of 9.92 days) can be harvested as a decay product of 229Th. Because the global quantity of 229Th is not being replenished and the 229Th/225Ac generator can only be eluted every nine weeks, annual worldwide production is limited to approximately 1.7 curies. Several alternative strategies are therefore being proposed to produce such isotopes directly.
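
The nine-week cadence reflects ingrowth through the intermediate 225Ra (half-life about 14.9 days). A rough Bateman estimate, treating the long-lived 229Th stock (half-life about 7,340 years) as constant, shows the harvestable 225Ac approaching its equilibrium ceiling on just that timescale; the half-lives below are approximate literature values.

```python
import math

LAM_RA = math.log(2) / 14.9   # 225Ra decay constant (1/day)
LAM_AC = math.log(2) / 9.92   # 225Ac decay constant (1/day)

def ac225_fraction(t_days):
    """Fraction of the equilibrium 225Ac activity present t days after an
    elution, for a constant 229Th parent: Bateman buildup through Ra -> Ac."""
    return (1.0 - math.exp(-LAM_AC * t_days)
            + LAM_AC / (LAM_RA - LAM_AC)
            * (math.exp(-LAM_RA * t_days) - math.exp(-LAM_AC * t_days)))

print(f"after 9 weeks: {ac225_fraction(63):.0%} of equilibrium")  # ~86%
```

After nine weeks the generator is back to roughly 86% of its ceiling, so little extra activity is gained by waiting longer between elutions.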

TAT radionuclides must be carefully processed before being used in medical applications. They must first be isolated with high radiochemical purity from the target material, which can be achieved using classical chemical procedures such as ion exchange, extraction and precipitation. Purified TAT radionuclides are then attached to biomolecule targeting vectors via a bifunctional chelator, which connects the biomolecule with a radionuclide complex (figure 2). The stability of compounds containing alpha-emitting radionuclides is a challenge because, after decay, most of the daughter isotopes are radioactive elements that no longer remain chelated. Moreover, the radioactive daughters can accumulate and cause unwanted toxicity in healthy organs, especially those involved in excretion such as the liver and kidneys. These issues have driven demand for more robust and stable chelation systems and/or encapsulation methods that contribute to an optimised pharmacokinetic profile with rapid cell internalisation. By doing so, the hope is to keep radioactive daughter nuclei proximal to the original decay site and thus close to the targeted tissue.

Several clinical trials with alpha-emitting radionuclides – including 225Ac (phase II trial) and 213Bi (phase III) – are under way around the world, based on the standard chelation approach. Despite the challenges involved, these trials are already showing extremely high promise and superiority over existing beta-emitting radionuclides. Further research is therefore warranted to investigate and optimise various production strategies designed to make TAT a viable clinical modality. The TAT isotope 225Ac has demonstrated particularly high potential in recent years because its half-life correlates well with the biological half-lives of intact antibodies, and its multiple alpha-emitting daughters enhance the therapeutic effect. 225Ac also can be used as a parent radionuclide for a 225Ac/213Bi generator system.

TRIUMF’s strategy

TRIUMF has extensive expertise in all aspects of the production of medical isotopes, including the development of high-power targets for large-scale production and isotope-production simulations with the Monte Carlo codes FLUKA and Geant4. TRIUMF’s strategy involves using both existing and new proton beamlines from its 520 MeV cyclotron, along with a newly built 30 MeV electron linac in the upcoming Advanced Rare IsotopE Laboratory (ARIEL) facility, to irradiate thorium and uranium targets to produce a variety of radiometals. These include 225Ra and 224Ra, which are parent isotopes for the daughter products 225Ac, 212,213Bi and 212Pb. Because these targets can be positioned downstream of the science targets, the symbiotic production of these radiometals is limited only by the beam intensity.

Under the envisioned operating conditions of the new proton beamline, FLUKA simulations of the ARIEL proton target station predict yields of several hundred millicuries of 225Ac per irradiation, plus significant quantities of other isotopes. While only very small quantities of 225Ac are required for radionuclide therapy, larger quantities are needed to produce enough 213Bi for those treatments where it is preferred. A larger demand for 213Bi will in turn drive increased demand for 225Ac to provide adequate 225Ac/213Bi generators. TRIUMF’s emerging production capacity would therefore yield sufficient 225Ac to enable the assembly of multiple 225Ac/213Bi generators for therapeutic research studies in patients at multiple centres. Based on typical operating-schedule estimates, this technique could produce several curies of 225Ac per year, compared with the current global output of 1 to 2 curies per year, making the proposed infrastructure a potentially potent source of this valuable isotope. Furthermore, many other medically relevant radioisotopes besides 225Ac are produced from a thorium or uranium target, and the higher-current proton beam at ARIEL will enable TRIUMF researchers to explore these isotopes further.
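
To give a feel for where such numbers come from (the FLUKA modelling itself is far more detailed), the standard activation formula A = R(1 − e^(−λt)) can be evaluated in a few lines; every beam and target parameter below – cross-section, current, thickness and irradiation time – is an illustrative assumption, not an ARIEL design value.

```python
import numpy as np

# Rough activation estimate for 225Ac from proton irradiation of thorium.
# All beam/target numbers are illustrative assumptions.
E_CHARGE, N_A, CI = 1.602e-19, 6.022e23, 3.7e10

beam_current = 35e-6       # A of protons (assumed)
sigma = 10e-27             # cm^2 (10 mb) production cross-section (assumed)
rho, a_mass = 11.7, 232.0  # thorium density (g/cm^3) and molar mass (g/mol)
thickness = 0.5            # cm of target seen by the beam (assumed)
t_irr = 10 * 86400.0       # ten-day irradiation (assumed)

lam = np.log(2) / (9.92 * 86400.0)  # 225Ac decay constant (1/s)
rate = (beam_current / E_CHARGE) * (rho * N_A / a_mass * thickness) * sigma
activity = rate * (1 - np.exp(-lam * t_irr))  # saturation-corrected

print(f"225Ac at end of bombardment: ~{activity / CI * 1e3:.0f} mCi")
```

With these placeholder numbers the estimate lands in the several-hundred-millicurie range quoted above; the real prediction depends on the measured excitation function and the actual beam parameters.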

The ultimate goal of TRIUMF’s TRT programme is to carry out clinical testing and establish the efficacy of TRT agents, enabling a national and possibly international clinical-trial programme for promising therapeutics. TRIUMF research partners will develop new radiopharmaceuticals incorporating therapeutic nuclei into targeting molecules, producing therapeutic conjugates that are used to shepherd their targeted payload to tumours. In addition, research will be carried out to design new molecules that can be used to target different types of tumours.

By leveraging TRIUMF’s existing infrastructure and established research partnerships, the medical community can look forward to the production of larger quantities of TRT isotopes. Should the promising results seen to date materialise into a viable treatment option for late-stage and/or currently untreatable cancers, it will bring new hope to a significant number of cancer patients worldwide.

Reactors and accelerators join forces https://cerncourier.com/a/viewpoint-reactors-and-accelerators-join-forces/ Fri, 16 Sep 2016 12:55:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/viewpoint-reactors-and-accelerators-join-forces/ Demand for medical isotopes requires reactor- and accelerator-based production methods.

Nuclear reactors are usually thought of in the context of electricity generation, whereby heat generated by nuclear fission produces steam to drive an alternator. A less well-known class of nuclear-fission reactors fulfils an entirely different societal goal. Known as research and test reactors, the heat they produce is a by-product, while the neutrons resulting from the fission reactions are used to irradiate materials or as probes for materials science. In some reactors, neutrons are used to transmute stable isotopes into radioactive ones, which are subsequently utilised for industrial or medical purposes.

Used in diagnostics and treatment, medical radioisotopes are a vital tool in the arsenal of oncologists in detecting and fighting cancer. In the case of 99mTc, which is a daughter product of 99Mo, roughly 30 million patients per year are injected with this isotope. This accounts for 80% of all nuclear-medicine diagnostic procedures, and demand is only growing as more of the global population gains access to advanced medicine. Classically, 99Mo is produced as a fission product in uranium targets: after irradiation lasting around one week, the targets are rushed off to the processing facility where the 99Mo is extracted. Since the half-life of 99Mo is only around 66 hours, there is no way to stock up on the isotope, and therefore a continuous chain of target production, irradiation, isotope extraction and purification – and finally supply to hospitals – is required.
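
The consequence of that short half-life is easily quantified: the shipped activity falls as A(t) = A0 × 2^(−t/T½), so every day in transit costs a significant fraction of the product. A minimal sketch, assuming a 66-hour half-life and illustrative delays:

```python
# Why 99Mo cannot be stockpiled: activity decays as A(t) = A0 * 2**(-t/T).
T_HALF_MO99 = 66.0  # hours

for delay_h in (24, 48, 72, 168):  # illustrative transport delays
    remaining = 2 ** (-delay_h / T_HALF_MO99)
    print(f"after {delay_h:3d} h in transit, {remaining:5.1%} of the 99Mo remains")
```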

The importance of a steady supply of medical radioisotopes such as 99Mo cannot be overestimated, yet it is generally not possible to cover the cost of operating a large research reactor or other facility solely through the production of radioisotopes, and even then the yield needs to be high enough to offset a significant fraction of the cost. Traditionally, the economics of constructing an accelerator facility for the sole purpose of generating 99Mo have been challenging, especially since the fission yield of 99Mo outweighs the possible yields from non-reactor methods by at least a factor of 10. Recently, however, a reduction in the construction costs of high-power accelerators and the increasing costs associated with operating reactors have generated interest in accelerator-based production of 99Mo, for example via semi-commercial initiatives such as SHINE and NorthStar in the US.

One of the driving forces behind these developments is the ageing of existing research reactors. The global supply of 99Mo mainly originates in a handful of reactors such as the BR2 in Belgium, the NRU in Canada and the HFR in the Netherlands, most of which are more than 50 years old. The NRU, which alone is responsible for about a third of the global supply of 99Mo, is scheduled to cease production this year. Some reactors are expected to continue operating for decades (such as OPAL in Australia, SAFARI in South Africa and BR2), while smaller research reactors such as MARIA in Poland and LVR-15 in the Czech Republic are becoming increasingly involved in radioisotope production, and new research reactors are being contemplated: MYRRHA in Belgium, PALLAS in the Netherlands and JHR in France (for which construction is ongoing), for instance. Despite these developments, it is uncertain whether the rising demand can continue to be met without assistance from accelerator-based production.

Neutrons are very suitable for isotope production because the cross-sections for neutron-induced nuclear reactions are often much larger than those for charged particles. As such, there is an advantage in using the neutrons already available at research reactors for isotope production. But it is clear that accelerators and reactors are highly complementary: reactors generate neutron-rich isotopes through fission or activation, whereas accelerators typically allow the production of proton-rich isotopes. Alpha emitters are also becoming more popular in nuclear medicine, particularly in palliative care, and the role of accelerators will likely become more important in the future production of such isotopes. It is therefore healthy to keep multiple production routes open for such vital and rare products, on which people’s lives can depend.

SESAME announces call for proposals https://cerncourier.com/a/sesame-announces-call-for-proposals/ Fri, 12 Aug 2016 07:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/sesame-announces-call-for-proposals/ SESAME, the pioneering synchrotron facility for the Middle East and neighbouring countries, located in Jordan, has announced its first call for proposals for experiments. A third-generation light source with a broad research capacity, SESAME’s first beams are due to circulate in the autumn and its experimental programme is scheduled to start in 2017. SESAME is […]

SESAME, the pioneering synchrotron facility for the Middle East and neighbouring countries, located in Jordan, has announced its first call for proposals for experiments. A third-generation light source with a broad research capacity, SESAME’s first beams are due to circulate in the autumn and its experimental programme is scheduled to start in 2017. SESAME is already host to a growing user community of some 300 scientists from across the region and is open to proposals for the best science, wherever they may come from.

SESAME will start up with two beamlines, one delivering infrared light and the other X-rays. The laboratory’s full scientific programme will span fields ranging from medicine and biology, through materials science, physics and chemistry to healthcare, the environment, agriculture and archaeology. Proposals can be submitted through the SESAME website (www.sesame.org.jo) and will be examined by a proposal-review committee.

“This is a very big moment for SESAME,” says SESAME director-general Khaled Toukan. “It signals the start of the research programme at the first international synchrotron research facility in our region.”

New super-heavy elements find names https://cerncourier.com/a/new-super-heavy-elements-find-names/ Fri, 12 Aug 2016 08:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/new-super-heavy-elements-find-names/ The International Union of Pure and Applied Chemistry (IUPAC) has announced the provisional names of four new super-heavy elements that complete the seventh row of the periodic table. The researchers responsible for the discoveries, which were made in Japan, Russia and the US during the past decade, proposed the following names for peer review: nihonium […]

The International Union of Pure and Applied Chemistry (IUPAC) has announced the provisional names of four new super-heavy elements that complete the seventh row of the periodic table. The researchers responsible for the discoveries, which were made in Japan, Russia and the US during the past decade, proposed the following names for peer review: nihonium (Nh) for element 113; moscovium (Mc) for 115; tennessine (Ts) for 117; and oganesson (Og) for 118.

Having reviewed the proposals and recommended them for acceptance, the IUPAC Inorganic Chemistry Division set in motion a five-month public review that will come to an end on 8 November, prior to formal approval by the IUPAC Council. In keeping with tradition, newly discovered elements can be named after a mythological concept or character (including an astronomical object); a mineral or similar substance; a place or geographical region; a property of the element; or a scientist.

In conjunction with the International Union of Pure and Applied Physics, IUPAC has also attached priority to the discovery claims. Element 113 was discovered by a collaboration at RIKEN in Japan, while elements 115 and 117 were synthesised at the U-400 accelerator complex at the Joint Institute for Nuclear Research (JINR) in Dubna, Russia, via a collaboration with the Lawrence Livermore National Laboratory (LLNL) and Oak Ridge National Laboratory in the US. The discovery of element 118 was attributed to a JINR–LLNL collaboration, which in 2011 was also acknowledged by IUPAC for the discovery of elements 114 (flerovium) and 116 (livermorium).

MAX IV paves the way for ultimate X-ray microscope https://cerncourier.com/a/max-iv-paves-the-way-for-ultimate-x-ray-microscope/ Fri, 12 Aug 2016 08:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/max-iv-paves-the-way-for-ultimate-x-ray-microscope/ Novel machine lattice produces brightest ever X-ray beams.

Since the discovery of X-rays by Wilhelm Röntgen more than a century ago, researchers have striven to produce smaller and more intense X-ray beams. With a wavelength similar to interatomic spacings, X-rays have proved to be an invaluable tool for probing the microstructure of materials. But a higher spectral power density (or brilliance) enables a deeper study of the structural, physical and chemical properties of materials, in addition to studies of their dynamics and atomic composition.

For the first few decades following Röntgen’s discovery, the brilliance of X-rays remained fairly constant due to technical limitations of X-ray tubes. Significant improvements came with rotating-anode sources, in which the heat generated by electrons striking an anode could be distributed over a larger area. But it was the advent of particle accelerators in the mid-1900s that gave birth to modern X-ray science. A relativistic electron beam traversing a circular storage ring emits X-rays in a tangential direction. First observed in 1947 by researchers at General Electric in the US, such synchrotron radiation has taken X-ray science into new territory by providing smaller and more intense beams.

Generation game

First-generation synchrotron X-ray sources were accelerators built for high-energy physics experiments, which were used “parasitically” by the nascent synchrotron X-ray community. As this community started to grow, stimulated by the increased flux and brilliance at storage rings, the need for dedicated X-ray sources with different electron-beam characteristics resulted in several second-generation X-ray sources. As with previous machines, however, the source of the X-rays was the bending magnets of the storage ring.

The advent of special “insertion devices” led to present-day third-generation storage rings – the first being the European Synchrotron Radiation Facility (ESRF) in Grenoble, France, and the Advanced Light Source (ALS) at Lawrence Berkeley National Laboratory in Berkeley, California, which began operation in the early 1990s. Instead of using only the bending magnets as X-ray emitters, third-generation storage rings have straight sections that allow periodic magnet structures called undulators and wigglers to be introduced. These devices consist of rows of short magnets with alternating field directions so that the net beam deflection cancels out. Undulators can house 100 or so short permanent magnets, each emitting X-rays in the same direction, which boosts the intensity of the emitted X-rays by two orders of magnitude. Furthermore, interference effects between the emitting magnets can concentrate X-rays of a given energy by another two orders of magnitude.
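
The photon wavelength at which this interference enhancement occurs follows from the standard on-axis undulator equation, λ = (λu/2γ²)(1 + K²/2). A minimal sketch for a 3 GeV ring, with the undulator period and deflection parameter K as illustrative assumptions:

```python
# On-axis undulator fundamental: lambda = lambda_u/(2*gamma**2) * (1 + K**2/2).
# Period and K below are illustrative assumptions, not a specific device.
E_e = 3.0e9            # electron energy (eV)
gamma = E_e / 0.511e6  # Lorentz factor
lambda_u = 18e-3       # undulator period (m), assumed
K = 1.0                # deflection parameter, assumed

lam = lambda_u / (2 * gamma**2) * (1 + K**2 / 2)
print(f"fundamental: {lam * 1e9:.2f} nm ({1.23984e-6 / lam:.0f} eV photons)")
```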

Third-generation light sources have been a major success story, thanks in part to the development of excellent modelling tools that allow accelerator physicists to produce precise lattice designs. Today, there are around 50 third-generation light sources worldwide, with a total number of users in the region of 50,000. Each offers a number of X-ray beamlines (up to 40 at the largest facilities) that fan out from the storage ring: X-rays pass through a series of focusing and other elements before being focused on a sample positioned at the end station, with the longest beamlines (measuring 150 m or more) at the largest light sources able to generate X-ray spot sizes a few tens of nanometres in diameter. Facilities typically operate around the clock, during which teams of users spend anywhere between a few hours to a few days undertaking experimental shifts, before returning to their home institutes with the data.

Although the corresponding storage-ring technology for third-generation light sources has been regarded as mature, a revolutionary new lattice design has led to another step up in brightness. The MAX IV facility at Maxlab in Lund, Sweden, which was inaugurated in June, is the first such facility to demonstrate the new lattice. Six years in construction, the facility has demanded numerous cutting-edge technologies – including vacuum systems developed in conjunction with CERN – to become the most brilliant source of X-rays in the world.

The multi-bend achromat

Initial ideas for the MAX IV project started at the end of the 20th century. Although the flagship of the Maxlab laboratory, the low-budget MAX II storage ring, was one of the first third-generation synchrotron radiation sources, it was soon outcompeted by several larger and more powerful sources entering operation. Something had to be done to maintain Maxlab’s accelerator programme.

The dominant magnetic lattice at third-generation light sources consists of double-bend achromats (DBAs), which have been around since the 1970s. A typical storage ring contains 10–30 achromats, each consisting of two dipole magnets and a number of magnet lenses: quadrupoles for focusing and sextupoles for chromaticity correction (at MAX IV we also added octupoles to compensate for amplitude-dependent tune shifts). The achromats are flanked by straight sections housing the insertion devices, and the dimensions of the electron beam in these sections are minimised by adjusting the dispersion of the beam (which describes the dependence of an electron’s transverse position on its energy) to zero. Other storage-ring improvements, for example faster correction of the beam orbit, have also helped to boost the brightness of modern synchrotrons. The key quantity underpinning these advances is the electron-beam emittance, which is defined as the product of the electron-beam size and its divergence.

Despite such improvements, however, today’s third-generation storage rings have a typical electron-beam emittance of between 2–5 nm rad, which is several hundred times larger than the diffraction limit of the X-ray beam itself. This is the point at which the size and spread of the electron beam approaches the diffraction properties of X-rays, similar to the Abbe diffraction limit for visible light (see panel below). Models of machine lattices with even smaller electron-beam emittances predict instabilities and/or short beam lifetimes that make the goal of reaching the diffraction limit at hard X-ray energies very distant.

Although it had been known for a long time that a larger number of bends decreases the emittance (and therefore increases the brilliance) of storage rings, it was only in the early 1990s that one of the present authors (DE) and others recognised that this could be achieved by incorporating a higher number of bends into the achromats. Such a multi-bend achromat (MBA) guides electrons around corners more smoothly, thereby decreasing the degradation in horizontal emittance. A few synchrotrons already employ triple-bend achromats, and the design has also been used in several particle-physics machines, including PETRA at DESY, PEP at SLAC and LEP at CERN, proving that a storage ring with an energy of a few GeV produces a very low emittance. To avoid prohibitively large machines, however, the MBA demands much smaller magnets than are currently employed at third-generation synchrotrons.

In 1995, our calculations showed that a seven-bend achromat could yield an emittance of 0.4 nm rad for a 400 m-circumference machine – 10 times lower than the ESRF’s value at the time. The accelerator community also considered a six-bend achromat for the Swiss Light Source and a five-bend achromat for a Canadian light source, but the small number of achromats in these lattices meant that it was difficult to make significant progress towards a diffraction-limited source. One of us (ME) took the seven-bend achromat idea and turned it into a real engineering proposal for the design of MAX IV. But the design then went through a number of evolutions. In 2002, the first layout of a potential new source was presented: a 277 m-circumference, seven-bend lattice that would reach an emittance of 1 nm rad for a 3 GeV electron beam. By 2008, we had settled on an improved design: a 520 m-circumference, seven-bend lattice with an emittance of 0.31 nm rad, which will be reduced by a factor of two once the storage ring is fully equipped with undulators. This is more or less the design of the final MAX IV storage ring.
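
The leverage that extra bends provide can be sketched from the ε ∝ E²/N³ scaling discussed in the panel at the end of this article; holding the energy and lattice constant fixed, only the dipole count matters. The dipole counts below are illustrative assumptions chosen to mimic a classic DBA ring and a MAX IV-like MBA ring:

```python
# Emittance scaling with dipole count, from emittance ~ Cq * E**2 / N**3.
# Cq and E are held fixed; dipole counts are illustrative assumptions.
n_dba = 2 * 32   # double-bend achromat ring with 32 achromats
n_mba = 7 * 20   # seven-bend achromat ring with 20 achromats (MAX IV-like)

reduction = (n_mba / n_dba) ** 3
print(f"emittance reduction from the extra bends: ~{reduction:.0f}x")
```

With these counts the scaling alone gives roughly the order-of-magnitude gain cited above; in practice the achievable emittance also depends on the optics within each achromat.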

In total, the team at Maxlab spent almost a decade finding ways to keep the lattice circumference at a value that was financially realistic, and even constructed a 36 m-circumference storage ring called MAX III to develop the necessary compact magnet technology. Dozens of problems had to be overcome. Because the electron density was so high, for example, we had to elongate the electron bunches by a factor of four using a second radio-frequency (RF) cavity system.

Block concept

MAX IV stands out in that it contains two storage rings, operated at energies of 1.5 and 3 GeV. Due to the different energies of each, and because the rings share an injector and other infrastructure, high-quality undulator radiation can be produced over a wide spectral range at marginal additional cost. The storage rings are fed electrons by a 3 GeV S-band linac made up of 18 accelerator units, each comprising one SLAC Energy Doubler RF station. To optimise the economy over a potential three-decade-long operation lifetime, and also to favour redundancy, a low accelerating gradient is used.

The 1.5 GeV ring at MAX IV consists of 12 DBAs, each comprising one solid-steel block that houses all the DBA magnets (bends and lenses). The idea of the magnet-block concept, which is also used in the 3 GeV ring, has several advantages. First, it enables the magnets to be machined with high precision and be aligned with a tolerance of less than 10 μm without having to invest in aligning laboratories. Second, blocks with a handful of individual magnets come wired and plumbed direct from the delivering company, and no special girders are needed because the magnet blocks are rigidly self-supporting. Last, the magnet-block concept is a low-cost solution.

We also needed to build a different vacuum system, because the small vacuum tube dimensions (2 cm in diameter) yield a very poor vacuum conductance. Rather than try to implement closely spaced pumps in such a compact geometry, our solution was to build 100% NEG-coated vacuum systems in the achromats. NEG (non-evaporable getter) technology, which was pioneered at CERN and other laboratories, uses metallic surface sorption to achieve extreme vacuum conditions. The construction of the MAX IV vacuum system raised some interesting challenges, but fortunately CERN had already developed the NEG coating technology to perfection. We therefore entered a collaboration that saw CERN coat the most intricate parts of the system, and licences were granted to companies who manufactured the bulk of the vacuum system. Later, vacuum specialists from the Budker Institute in Novosibirsk, Russia, mounted the linac and 3 GeV-ring vacuum systems.

Due to the small beam size and high beam current, intra-beam scattering and “Touschek” lifetime effects must also be addressed. Both arise from the high electron density in small-emittance, high-current rings, in which electrons within a bunch collide with one another. Large energy changes among the electrons push some of them outside the energy acceptance of the ring, while smaller energy deviations cause the beam size to increase too much. For these reasons, a low-frequency (100 MHz) RF system with bunch-elongating harmonic cavities was introduced to decrease the electron density and stabilise the beam. This RF system also allows powerful commercial solid-state FM transmitters to be used as RF sources.

When we first presented the plans for the radical MAX IV storage ring in around 2005, people working at other light sources thought we were crazy. The new lattice promised a factor of 10–100 increase in brightness over existing facilities at the time, offering users unprecedented spatial resolutions and taking storage rings within reach of the diffraction limit. Construction of MAX IV began in 2010 and commissioning began in August 2014, with regular user operation scheduled for early 2017.

On 25 August 2015, an amazed accelerator staff sat looking at the beam-position monitor read-outs of MAX IV’s 3 GeV ring. With just the calculated magnetic settings plugged in, and thanks to the precisely CNC-machined magnet blocks, each containing a handful of integrated magnets, the beam circulated turn after turn, behaving as predicted. For the 3 GeV ring, a number of problems remained to be solved. These included dynamic issues – such as betatron tunes, dispersion, chromaticity and emittance – in addition to more trivial technical problems such as sparking RF cavities and faulty power supplies.

As of MAX IV’s inauguration on 21 June, the injector linac and the 3 GeV ring are operational, with the linac also delivering X-rays to the Short Pulse Facility. A circulating current of 180 mA can be stored in the 3 GeV ring with a lifetime of around 10 h, and we have verified the design emittance with a value in the region of 300 pm rad. Beamline commissioning is also well under way, with some 14 beamlines under construction and a goal to increase that number to more than 20.

Sweden has a well-established synchrotron-radiation user community, although around half of MAX IV users will come from other countries. A variety of disciplines and techniques are represented nationally, which must be mirrored by MAX IV’s beamline portfolio. Detailed discussions between universities, industry and the MAX IV laboratory therefore take place prior to any major beamline decisions. The high brilliance of the MAX IV 3 GeV ring and the temporal characteristics of the Short Pulse Facility are a prerequisite for the most advanced beamlines, with imaging being one promising application.

Towards the diffraction limit

MAX IV could not have reached its goals without a dedicated staff and help from other institutes. CERN helped us with the intricate NEG-coated vacuum system, the Budker Institute with the installation of the linac and ring vacuum systems, and the brand-new Solaris light source in Krakow, Poland (an exact copy of the MAX IV 1.5 GeV ring) with operations; many other labs have offered advice. The MAX IV facility has also been marked out for its environmental credentials: its energy consumption is reduced by the use of high-efficiency RF amplifiers and small magnets with low power consumption. Even the water-cooling system of MAX IV transfers heat to the nearby city of Lund to warm houses.

The MAX IV ring is the first of the MBA kind, but several MBA rings are now in construction at other facilities, including the ESRF, Sirius in Brazil and the Advanced Photon Source (APS) at Argonne National Laboratory in the US. The ESRF is developing a hybrid MBA lattice that would enter operation in 2019 and achieve a horizontal emittance of 0.15 nm rad. The APS has decided to pursue a similar design that could enter operation by the end of the decade and, being larger than the ESRF, the APS can strive for an even lower emittance of around 0.07 nm rad. Meanwhile, the ALS in California is moving towards a conceptual design report, and Spring-8 in Japan is pursuing a hybrid MBA that will enter operation on a similar timescale.

Indeed, a total of some 10 rings are currently under construction or planned. We can therefore look forward to a new generation of synchrotron storage rings producing X-rays with very high transverse coherence. We will then have witnessed an increase of 13–14 orders of magnitude in the brightness of synchrotron X-ray sources over a period of seven decades, and put the diffraction limit at high X-ray energies firmly within reach.

One proposal would see such a diffraction-limited X-ray source installed in the 6.3 km-circumference tunnel that once housed the Tevatron collider at Fermilab, Chicago. Perhaps a more plausible scenario is PETRA IV at DESY in Hamburg, Germany. Currently the PETRA III ring is one of the brightest in the world, but this upgrade (if it is funded) could bring the ring performance to the diffraction limit at hard X-ray energies. This is the Holy Grail of X-ray science, providing the highest resolution and signal-to-noise ratio possible, in addition to the lowest-radiation damage and the fastest data collection. Such an X-ray microscope will allow the study of ultrafast chemical reactions and other processes, taking us to the next chapter in synchrotron X-ray science.

Towards the X-ray diffraction limit

Electromagnetic radiation faces a fundamental limit in terms of how sharply it can be focused. For visible light, it is called the Abbe limit, as shown by Ernst Karl Abbe in 1873. The diffraction limit is defined as λ/(4π), where λ is the wavelength of the radiation. Reaching the diffraction limit for X-rays emitted from a storage ring (approximately 10 pm rad) is highly desirable from a scientific perspective: not only would it bring X-ray microscopy to its limit, but material structure could be determined with much less X-ray damage and fast chemical reactions could be studied in situ. Currently, the electron beam travelling in a storage ring dilutes the X-ray emittance by orders of magnitude. Because this quantity determines the brilliance of the X-ray beam, reaching the X-ray diffraction limit is a case of reducing the electron-beam emittance as far as possible.

The emittance is defined as Cq × E²/N³, where Cq is the ring magnet-lattice constant, E is the electron energy and N is the number of dipole magnets. It has two components: horizontal (given by the magnet lattice and electron energy) and vertical (which is mainly caused by coupling from the horizontal emittance). While the vertical emittance is, in principle, controllable and small compared with the horizontal emittance, the latter has to be minimised by choosing an optimised magnet lattice with a large number of magnet elements.

Because Cq can be brought to the theoretical minimum emittance limit and E is given by the desired spectral range of the X-rays, the only parameter remaining with which we can decrease the electron-beam emittance is N. Simply increasing the number of achromats to increase N turns out not to be a practical solution, however, because the rings are too big and expensive and/or the electrons tend to be unstable and leave the ring. However, a clever compromise called the multi-bend achromat (MBA), based on compact magnets and vacuum chambers, allows more elements to be incorporated around a storage ring without increasing its diameter, and in principle this design could allow a future storage ring to achieve the diffraction limit.
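
A minimal numerical sketch of these two formulas, using a 0.1 nm hard-X-ray wavelength (an assumed reference point) and the emittance values quoted in this article, shows how far storage rings still sit from the limit:

```python
import numpy as np

# Diffraction-limited emittance lambda/(4*pi) versus electron-beam emittances.
wavelength = 1e-10                     # 0.1 nm, a typical hard-X-ray value
diff_limit = wavelength / (4 * np.pi)  # ~8 pm rad

for name, eps in [("typical third-generation ring", 3e-9),
                  ("MAX IV 3 GeV ring (measured)", 300e-12)]:
    print(f"{name}: {eps / diff_limit:.0f}x above the diffraction limit")
```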

CERN experiment points to a cloudier pre-industrial climate https://cerncourier.com/a/cern-experiment-points-to-a-cloudier-pre-industrial-climate/ Fri, 08 Jul 2016 08:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/cern-experiment-points-to-a-cloudier-pre-industrial-climate/ New results reported in two papers in Nature from the CLOUD experiment at CERN imply that the pre-industrial climate may have had brighter and more extensive clouds than previously thought, sharpening our understanding of the impact of human activities on climate. CLOUD (Cosmics Leaving Outdoor Droplets) is designed to understand how aerosol particles form and […]

New results reported in two papers in Nature from the CLOUD experiment at CERN imply that the pre-industrial climate may have had brighter and more extensive clouds than previously thought, sharpening our understanding of the impact of human activities on climate. CLOUD (Cosmics Leaving Outdoor Droplets) is designed to understand how aerosol particles form and grow in the atmosphere, and the effect this has on clouds and climate. It comprises a 26 m³ vacuum chamber containing atmospheric particles, into which beams of charged pions are fired from the Proton Synchrotron to mimic the seeding of clouds by galactic cosmic rays.

The increase in aerosols and clouds since pre-industrial times is one of the largest sources of uncertainty in climate change, according to the Intergovernmental Panel on Climate Change. The new CLOUD results show that organic vapours emitted by trees produce abundant aerosol particles in the atmosphere in the absence of sulphuric acid. Previously, it was thought that sulphuric acid – which largely arises from burning fossil fuels – was essential to initiate aerosol particle formation. CLOUD finds that oxidised biogenic vapours dominate particle growth in unpolluted environments, starting just after the first few molecules have stuck together and continuing all the way up to sizes above 50–100 nm, where the particles can seed cloud droplets.

The experiment also finds that ions from galactic cosmic rays enhance the production rate of pure biogenic particles by a factor of 10–100 compared with particles without ions, which suggests that cosmic rays played a more important role in aerosol and cloud formation in pre-industrial times than they do in today’s polluted atmosphere.

CLOUD, which has produced a series of high-impact publications following its first results in 2011, is the first experiment to reach the demanding technological performance and ultralow contaminant levels necessary to be able to measure aerosol nucleation and growth under controlled conditions in the laboratory.

TPS exceeds design goal of 500 mA stored current https://cerncourier.com/a/tps-exceeds-design-goal-of-500-ma-stored-current/ Fri, 18 Mar 2016 09:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/tps-exceeds-design-goal-of-500-ma-stored-current/ In December last year, the 3 GeV Taiwan Photon Source (TPS) of the National Synchrotron Radiation Research Center (NSRRC) stored 520 mA of electron current in its storage ring, and gave the world a bright synchrotron light as the International Year of Light 2015 came to an end. This is the second phase of commissioning conducted after […]

In December last year, the 3 GeV Taiwan Photon Source (TPS) of the National Synchrotron Radiation Research Center (NSRRC) stored 520 mA of electron current in its storage ring, giving the world a bright synchrotron light as the International Year of Light 2015 came to an end. This was the second phase of commissioning, conducted after five months of preparation work aimed at bringing the electron current of the TPS to its design value of 500 mA (CERN Courier June 2010 p16 and April 2015 p22).

After the first light of the TPS shone on 31 December 2014, beam injection stored an electron current greater than 100 mA, with a booster-to-storage-ring transfer efficiency exceeding 75%, using PETRA cavities. To overcome instabilities of the electron beam, high chromaticity and a vertical feedback system were applied to damp the vertical instability at high current, in this case close to 100 mA, whereas the longitudinal instability appeared when the beam current reached around 85 mA. Subsequently, the dynamic pressure of the vacuum conditioning reached 10⁻⁷ Pa at 100 mA after an accumulated beam dose of 35 ampere-hours. At this stage, the TPS was ready for the upgrade implementation scheduled for the remainder of 2015.

Several new components were installed during this phase, including new undulators and superconducting cavities, while the cryogenic and control systems were completed.

The upgrade activities also involved the injection system and the transfer line between booster and storage, to improve the injection efficiency and the stability of the system. In addition, 96 fast-feedback corrector magnets were placed at both ends of the straight sections, as well as upstream of the dipole magnets.

After several test runs in the fourth quarter of 2015, an unfamiliar phenomenon began to emerge, preventing the electron current from progressing beyond 230 mA. The pressure of the vacuum chamber located in the first dipole of the second arc section in the storage ring repeatedly surged to more than 300 times the normal value of 10 × 10⁻⁹ Pa when the beam current increased to 190 mA. A small metal–plastic pellet that had contaminated the vacuum environment was removed and the staff performed flange welding on the spot.

After the vacuum problem had been solved, commissioning of TPS went smoothly, ramping from 0 to 520 mA in 11 minutes on 12 December.

While the TPS was ramping up to its stored-current target value, two beamlines – the protein microcrystallography beamline (TPS-05) and the temporally coherent X-ray diffraction beamline (TPS-09) – were in the commissioning phase. The TPS beamlines will be open for use in 2016.

Science with a medical PET cyclotron https://cerncourier.com/a/science-with-a-medical-pet-cyclotron/ Fri, 18 Mar 2016 09:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/science-with-a-medical-pet-cyclotron/ Résumé Science with a medical PET cyclotron: beyond the routine production of radioisotopes for medical imaging, compact PET cyclotrons can be at the heart of multidisciplinary research facilities. This is the case at the Bern cyclotron laboratory, designed so that the accelerator can be used for scientific purposes in parallel with the production […]

Résumé

Science with a medical PET cyclotron

Beyond the routine production of radioisotopes for medical imaging, compact PET cyclotrons can be at the heart of multidisciplinary research facilities. This is the case at the Bern cyclotron laboratory, which was designed so that the accelerator could be used for scientific purposes in parallel with radioisotope production. Over the years, the facility has become the principal instrument for a whole range of research activities involving teams of physicists, chemists, pharmacists and biologists.

Particle accelerators are fundamental instruments in modern medicine, where they are used to study the human body and to detect and cure its diseases. Instrumentation arising from fundamental physics research is very common in hospitals, including positron emission tomography (PET) and cancer hadrontherapy.

To match the needs of a continuously evolving field and to fulfil the stringent requirements of hospital-based installations, specific particle accelerators have been developed in recent years. In particular, modern medical cyclotrons devoted to proton cancer treatments and to the production of radioisotopes for diagnostics and therapy are compact, user-friendly, affordable and able to ensure very high performance.

Medical PET cyclotrons usually run during the night or early in the morning, for the production of radiotracers that will be used for imaging. Their beams, with energies of about 20 MeV and currents of the order of 100 μA, are in principle available for other purposes during the daytime. This represents an opportunity to exploit the science potential of these accelerators well beyond medical-imaging applications. In particular, they can be optimised to produce beams in the picoampere and nanoampere range, opening the way to nuclear and detector physics, material science, radiation biophysics, and radiation-protection research.

On the other hand, to perform cutting-edge multidisciplinary research, beams of variable shape and intensity must be available, together with the possibility to access the beam area. This cannot be realised in standard medical PET cyclotron set-ups, where severe access limitations occur due to radiation-protection issues. Furthermore, the targets for the production of PET radioisotopes are directly mounted on the cyclotron right after extraction, with consequent limitations in the use of the beams. To overcome these problems, medical PET cyclotrons can be equipped with a transport line leading the beam to a second bunker, which can always be accessible for scientific activities.

The Bern cyclotron laboratory

The Bern medical PET cyclotron laboratory was conceived to use the accelerator for scientific purposes in parallel with radioisotope production. It is situated in the campus of the Inselspital, the Bern University hospital, and has been in operation since 2013. The heart of the facility consists of an 18 MeV cyclotron providing single or dual beams of H⁻ ions. A maximum extracted current of 150 μA is obtained by stripping the negative ions. Targets can be located in eight different out-ports. Four of them are used for fluorine-18 production, one is equipped with a solid target station, and one is connected to a 6 m-long beam transfer line (BTL). The accelerator is located inside a bunker, while a second bunker with independent access hosts the BTL and is fully dedicated to research. The beam optics of the BTL is realised by one horizontal and one vertical steering magnet, together with two quadrupole doublets, one in the cyclotron bunker and the other in the research area. A neutron shutter prevents neutrons from entering the research bunker during routine production, avoiding radiation damage to scientific instrumentation. The BTL, rather unusually for a hospital cyclotron, represents the pillar of this facility. Although initially more expensive than a standard PET cyclotron facility, this solution ensures complete exploitation of the accelerator beam time and allows for synergy among academic, clinical and industrial partners.

Multidisciplinary research activities

The Bern facility hosts a full programme of multidisciplinary research, carried out by a team of physicists, chemists, pharmacists and biologists. The BTL and the related physics laboratory have so far been the main instruments for carrying out research on particle detectors, accelerator physics, radiation protection, and novel radioisotopes for diagnostics and therapy.

To reach beam currents down to the picoampere range, a specific method was developed based on tuning the ion source, the radiofrequency and the current in the main coil. Such currents are far below those employed for radioisotope production, and PET cyclotrons are not equipped with sufficiently sensitive instrumentation. A novel compact profile-monitor detector was therefore conceived and built to measure, control and use these low-intensity beams. A scintillating fibre crossing the beam produces light that can be collected to measure its profile. Specific doped-silica scintillating fibres were produced in collaboration with the Institute of Applied Physics (IAP) in Bern. A wide-intensity-range beam-monitoring detector was realised, able to span currents from 1 pA to 20 μA. The versatility of the instrument attracted the interest of industry, and it has become a commercial spin-off of the research activity. Moreover, the beam monitor was used to measure the transverse beam emittance of cyclotrons, opening the way to further accelerator-physics developments.
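
The dynamic range involved is easier to appreciate in terms of particle rates: a current I of singly charged ions corresponds to I/e particles per second. A minimal sketch spanning the monitor's quoted range:

```python
# Particle rate corresponding to a beam current I of singly charged ions.
E_CHARGE = 1.602e-19  # C

for label, current in [("1 pA", 1e-12), ("1 nA", 1e-9), ("20 uA", 20e-6)]:
    print(f"{label:>5}: {current / E_CHARGE:.2e} particles per second")
```

From about 6 × 10⁶ to over 10¹⁴ particles per second – more than seven orders of magnitude covered by a single fibre-based instrument.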

The large amount of fluorine-18 produced daily requires a complex radiation-protection monitoring system consisting of about 40 detectors. Besides γ and neutron monitoring, special care is paid to air contamination – a potential hazard for workers and the general population. This system is both a safety and a research tool. Radioactivity induced in the air by proton and neutron beams was studied and the produced activity measured. The results were in good agreement with calculations based on excitation functions, and can be used to assess the radioactivity induced in air by proton and neutron beams in the energy range of PET cyclotrons. A direct application of this study is the assessment of radiation protection for scientific activities requiring beam extraction into air.
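
Such excitation-function estimates reduce, for a single reaction channel, to the standard saturation formula A = nσφ(1 − e^(−λt)). The sketch below applies it to 41Ar production by neutron capture on the argon naturally present in air; the neutron flux, irradiated volume and irradiation time are illustrative assumptions, not measured Bern values.

```python
import numpy as np

# Saturation activity A = n * sigma * phi * (1 - exp(-lambda*t)) for
# 40Ar(n,gamma)41Ar in irradiated air. Flux, volume and time are assumed.
N_AIR = 2.5e19                  # air molecules per cm^3 at room temperature
AR_FRACTION = 0.0093            # argon fraction of air by volume
phi = 1e6                       # thermal-neutron flux (n/cm^2/s), assumed
sigma = 0.66e-24                # 40Ar(n,gamma) thermal cross-section (cm^2)
lam = np.log(2) / (109.6 * 60)  # 41Ar decay constant (1/s)

volume = 1e6                    # 1 m^3 of air, in cm^3 (assumed)
n_ar = AR_FRACTION * N_AIR * volume
t_irr = 3600.0                  # one hour of beam (assumed)

activity = n_ar * sigma * phi * (1 - np.exp(-lam * t_irr))
print(f"41Ar after 1 h: {activity / 1e3:.0f} kBq per m^3 of air")
```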

Another distinctive feature of the Bern cyclotron is its radiopharmacy, conceived to bring together industrial production for medicine and scientific research. It features three Good Manufacturing Practice (GMP)-qualified laboratories, one of which is fully devoted to research. The existence of this laboratory and of the BTL brought together physicists and radiochemists of the University of Bern and of the Paul Scherrer Institute (PSI), triggering a multidisciplinary project funded by the Swiss National Science Foundation (SNSF). Scandium-43 is proposed as a novel PET radioisotope, having nearly ideal nuclear-decay properties for this imaging technique. Furthermore, scandium is suitable for theranostics (combined diagnostics and therapy): the same biomolecule can be labelled with a positron-emitting isotope for imaging and with a β⁻ emitter for cancer therapy. Advances in nuclear medicine will only be possible if suitable quantities of scandium-43 are available. The goal of the project is to produce clinically relevant amounts of this radioisotope with a quality appropriate for clinical trials.

The results described above represent examples of the wide spectrum of research activities that can be pursued at the Bern facility. Several other fields can be addressed, such as the study of materials by PIXE and PIGE ion-beam analysis, irradiation of biological samples, and investigation of the radiation hardness of scientific instrumentation.

The organisation of a facility of this kind naturally triggers national and international collaborations. The 12th workshop of the European Cyclotron Network (CYCLEUR) will take place in Bern on 23–24 June 2016, to bring together international experts. Last but not least, students and young researchers can profit from unique training opportunities in a stimulating, multidisciplinary environment, to move towards further advances in the application of particle-physics technologies.

• For further reading, visit arxiv.org/abs/1601.06820.

Data sonification enters the biomedical field https://cerncourier.com/a/data-sonification-enters-the-biomedical-field/ Fri, 12 Feb 2016 09:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/data-sonification-enters-the-biomedical-field/ Résumé Data sonification enters the biomedical field: music and the life sciences have much in common; both disciplines involve the concepts of cycles, periodicity, fluctuations, transitions and even, curiously, harmony. Using the technique of sonification, scientists are able to […]

Résumé

Data sonification enters the biomedical field

Music and the life sciences have much in common: both disciplines involve the concepts of cycles, periodicity, fluctuations, transitions and even, curiously, harmony. Using the technique of sonification, scientists are able to perceive and quantify the co-ordination of human body movements, improving our knowledge and understanding of motor control as a self-organised dynamical system that passes through stable and unstable states in response to changes in the constraints acting on the organism, the task and the environment.

Resonances, periodicity, patterns and spectra are well-known notions that play crucial roles in particle physics, and that have always been at the junction between sound/music analysis and scientific exploration. Detecting the shape of a particular energy spectrum, studying the stability of a particle beam in a synchrotron, and separating signals from a noisy background are just a few examples where the connection with sound can be very strong, all sharing the same concepts of oscillations, cycles and frequency.

In 1619, Johannes Kepler published his Harmonices Mundi (the “harmonies of the world”), a monumental treatise linking music, geometry and astronomy. It was one of the first times that music, an artistic form, was presented as a global language able to describe relations between time, speed, repetitions and cycles.

The research we are conducting is based on the same ideas and principles: music is a structured language that enables us to examine and communicate periodicity, fluctuations, patterns and relations. Almost every notion in the life sciences is linked with the idea of cycles, periodicity, fluctuations and transitions. These properties are naturally related to musical concepts such as pitch, timbre and modulation. In particular, vibrations and oscillations play a crucial role, both in the life sciences and in music. Take, for example, the regulation of glucose in the body. Insulin is produced by the pancreas, creating a periodic oscillation in blood insulin that is thought to stop the down-regulation of insulin receptors in target cells. Indeed, these oscillations in the metabolic process are so important that constant inputs of insulin can jeopardise the system.

Oscillations are also the most crucial concept in music. What we call “sound” is the perceived result of regular mechanical vibrations happening at characteristic frequencies (between 20 and 20,000 times per second). Our ears are naturally trained to recognise the shape of these oscillations, their stability or variability, the way they combine and their interactions. Concepts such as pitch, timbre, harmony, consonance and dissonance, so familiar to musicians, all have a formal description and characterisation that can be expressed in terms of oscillations and vibrations.

Many human movements are cyclic in nature. An important example is gait – the manner of walking or running. If we track the position of any point on the body over time, for example the shoulder or the knee, we see it describing a regular, cyclic movement. If the gait is stable, as in walking at a constant speed, the associated frequency is regular, with small variations due to the inherent variability of the system. By measuring, for example, the vertical displacement of the centre of each joint while walking or running, we obtain a series of one-dimensional oscillating waveforms. The collection of these waveforms provides a representation of the co-ordinated movement of the body. Studying their properties, such as phase relations, frequencies and amplitudes, then provides a way to investigate the order parameters that define modes of co-ordination.

Previous methods of examining the relation between components of the body have included statistical techniques such as principal-component analysis, or analysis of coupled oscillators through vector coding or continuous relative phase. However, a problem is that data are lost using statistical techniques, and small variations due to the inherent variability of the system are ignored. Conversely, a coupled oscillator can cope only with two components contributing to the co-ordination.

Sonograms to study body movements

Our approach is based on the idea of analysing the waveforms and their relations by translating them into audible signals and using the natural capability of the ear to distinguish, characterise and analyse waveform shapes, amplitudes and relations. This process is called data sonification, and one of the main tools to investigate the structure of the sound is the sonogram (sometimes also called a spectrogram). A sonogram is a visual representation of how the spectrum of a sound signal changes with time, and we can use sonograms to examine the phase relations between a large collection of variables without having to reduce the data. Spectral analysis is a particularly relevant tool in many scientific disciplines, for example in high-energy physics, where the interest lies in energy spectra, pattern and anomaly detection, and phase transitions.

Using a sonogram to examine the movement of multiple markers on the body in the frequency domain, we can obtain an individual and situation-specific representation of co-ordination between the major limbs. Because anti-phase frequencies cancel, in-phase frequencies enhance each other, and a certain degree of variability in the phase of the oscillation results in a band of frequencies, we are able to represent the co-ordination within the system through the resulting spectrogram.

In our study, we can see exactly this. A participant ran on a treadmill that accelerated between speeds of 0 and 18 km/h over two minutes. A motion-analysis system was used to collect 3D kinematic data from 24 markers placed bilaterally on the head, neck, shoulders, elbows, wrists, hands, pelvis, hips, knees, heels, ankles and toes of the participant (sampling frequency 100 Hz, trial length 120 s). Individual and combined sensor measurements were resampled to generate audible waveforms. Sonograms were then computed using moving Hanning analysis windows for all of the sound signals obtained from each marker and combination of markers.
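
A minimal sketch of this analysis chain (omitting the resampling-to-audio step) can be assembled with standard tools: two synthetic marker waveforms in perfect anti-phase are summed – the additive synthesis described below – and their sonograms computed with Hann windows. The stride-frequency profile is an illustrative assumption, not the study's data; in real recordings the cancellation is only partial, leaving the frequency bands discussed in the text.

```python
import numpy as np
from scipy.signal import spectrogram

# Two synthetic "markers" in anti-phase, summed and analysed with Hann
# windows. The stride-frequency drift is an illustrative assumption.
fs = 100.0                       # motion-capture sampling rate (Hz)
t = np.arange(0, 120.0, 1 / fs)  # 120 s trial, as in the study
stride = 1.4 + 0.4 * t / 120.0   # stride frequency drifting upwards (Hz)
phase = 2 * np.pi * np.cumsum(stride) / fs

left_knee = np.sin(phase)            # vertical displacement, arbitrary units
right_elbow = np.sin(phase + np.pi)  # opposing limb, in anti-phase

for name, sig in [("single marker", left_knee),
                  ("combined (anti-phase)", left_knee + right_elbow)]:
    f, tt, Sxx = spectrogram(sig, fs=fs, window="hann",
                             nperseg=512, noverlap=384)
    print(f"{name}: peak spectral power {Sxx.max():.2e}")
```

In this idealised sketch the combined signal's spectrum collapses to numerical noise because the two components cancel exactly; with the variability of real gait, the cancellation instead thins out specific spectral lines, which is what the sonograms reveal.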

Sonification of individual and combined markers is shown above right. Sonification of an individual marker placed on the left knee (top left in the figure) shows the frequencies underpinning the marker movement on that particular joint-centre. By combining the markers, say of a whole limb such as the leg, we can examine the relations of single markers, through the cancellation and enhancement of frequencies involved. The result will show some spectral lines strengthening, others disappearing and others stretching to become bands (top right). The nature of the collective movements and oscillations that underpin the mechanics of an arm or a leg moving regularly during the gait can then be analysed through the sound generated by the superposition of the relative waveforms.

A particularly interesting case appears when we combine audifications of marker signals coming from opposing limbs, for example left leg/right arm or right leg/left arm. The sonogram bottom left in the figure is the representation of the frequency content of the oscillations related to the combined sensors on the left leg and the right arm (called additive synthesis in audio engineering). If we compare the sonogram of the left leg alone (top right) and the combination with the opposing arm, we can see that some spectral lines disappear from the spectrum, because of the phase opposition between some of the markers, for example the left knee and the right elbow, or the left ankle and the right hand.

The final result of this cancellation is a globally simpler dynamical system, described by a smaller number of frequencies. The frequencies themselves, their sharpness (variability) and the point of transition provide key information about the system. In addition, we are able to observe and hear the phase transition between the walking and the running state, indicating that our technique is suitable for examining these order-parameter states.

Sonification of movement as audio feedback

Sonification, as in the example above, does not require data reduction. It can provide us with unique ways of quantifying and perceiving co-ordination in human movement, contributing to our knowledge and understanding of motor control as a self-organised dynamical system that moves through stable and unstable states in response to changes in organismic, task and environmental constraints. For example, the specific measurement described above is a tool to increase our understanding of how human motor control adapts to something like a prosthetic limb. The technique will also aid the diagnosis and tracking of pathological and perturbed gait, for example by highlighting key changes in gait with ageing or after leg surgery.

In addition, we can also use sonification of movements as a novel form of audio feedback. Movement is key to healthy ageing and recovery from injuries or even pathologies. Physiotherapists and practitioners prescribe exercises that take the human body through certain movements, creating certain forces. The precise execution of these exercises is fundamental to the expected benefits, and while this is possible under the watchful eye of the physiotherapist, it can be difficult to achieve when alone at home.

Precisely executing exercises presents three main challenges. First, the patient must remember what the correct movement or exercise should look like. Second, the patient must be able to execute the movement correctly – working the right muscles to move the joints and limbs through the correct space, over the right amount of time or with an appropriate amount of force. Last, finding the motivation to perform exercises that can be painful, strenuous or boring, sometimes many times a day, is a challenge in itself.

Sonification can provide not only real-time audio feedback but also elements of feed-forward, giving a quantitative reference for the correct execution of movements. This means that the patient has access to a map of the correct movements through real-time feedback, enabling them to perform the exercises correctly. And let's not forget motivation: through sonification, the patient's movements can generate not only waveforms but also melodies and sounds that are pleasing.

Another possible application of generating melodies associated with movement is in the artistic domain. Accelerometers, vibration sensors and gyroscopes can turn gestures into melodic lines and harmonies. The demo organised during the public talk of the International Conference on Translational Research in Radio-Oncology – Physics for Health in Europe (ICTR-PHE), on 16 February in Geneva, was based on that principle. Using accelerometers attached to the arm of a flute player, we could generate melodies related to the movements naturally occurring when playing, in a sort of duet between the flute and the flutist. Art and science, music and movement, seem to be linked in a natural but profound way by a multitude of different threads, and technology keeps providing the right tools to continue the investigation, just as Kepler did four centuries ago.

The post Data sonification enters the biomedical field appeared first on CERN Courier.

]]>
https://cerncourier.com/a/data-sonification-enters-the-biomedical-field/feed/ 0 Feature
Networking against cancer with ENLIGHT https://cerncourier.com/a/networking-against-cancer-with-enlight/ https://cerncourier.com/a/networking-against-cancer-with-enlight/#respond Fri, 15 Jan 2016 09:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/networking-against-cancer-with-enlight/ The European Network for Light Ion Hadron therapy (ENLIGHT) adapts to the rapidly developing hadron-therapy scene.

The post Networking against cancer with ENLIGHT appeared first on CERN Courier.

]]>

Since the establishment of the first hospital-based proton-treatment centres in the 1990s, hadrontherapy has continued to progress in Europe and worldwide. In particular, the last decade has seen exponential growth in the number of facilities, accompanied by a rapid increase in the number of patients treated, an expanded list of medical indications, and increasing interest in other types of ions, especially carbon. Harnessing the full potential of hadrontherapy requires the expertise and ability of physicists, physicians, radiobiologists, engineers and information-technology experts, as well as collaboration between academic, research and industrial partners. Thirteen years ago, the necessity to catalyse efforts and co-operation among these disciplines led to the establishment of the European Network for Light Ion Hadrontherapy (ENLIGHT). Its recent annual meeting, held in Cracow in September, offered a broad overview of the current status and challenges of hadrontherapy, as well as stimulating discussion on the future organisation of the community.

Networking is key

ENLIGHT was launched in 2002 (CERN Courier May 2002 p29) with an ambitious, visionary and multifaceted plan to steer European research efforts in using ion beams for radiation therapy. ENLIGHT was envisaged not only as a common multidisciplinary platform, where participants could share knowledge and best practice, but also as a provider of training and education, and as an instrument to lobby for funding in critical research and innovation areas. During the years, the network has evolved, adapting its structure and goals to emerging scientific needs (CERN Courier June 2006 p27).

ICTR-PHE

The third edition of the International Conference on Translational Research in Radio-Oncology – Physics for Health in Europe will be held in Geneva from 15 to 19 February. This unique conference gathers scientists from a variety of fields, including detector physicists, radiochemists, nuclear-medicine physicians and other physicists, biologists, software developers, accelerator experts and oncologists.

ICTR-PHE is a biennial event, co-organised by CERN, where the main aim is to foster multidisciplinary research by positioning itself at the intersection of physics, medicine and biology. At ICTR-PHE, physicists, engineers and computer scientists share their knowledge and technologies, while doctors and biologists present their needs and vision for the medical tools of the future, therefore triggering breakthrough ideas and technological developments in specific areas.

The high standards set by the ICTR-PHE conferences have garnered not only an impressive scientific community, but also ever-increasing interest and participation from industry. ICTR-PHE 2016 is also an opportunity for companies to exhibit their products and services at the technical exhibition included in the programme.

The annual ENLIGHT meeting has always played a defining role in this evolutionary process. This year, new and long-time members were challenged to an open discussion on the future of the network, after a day and a half of inspiring talks on various aspects of hadrontherapy.

Challenges ahead

Emerging topics in all forms of radiation therapy are the collection, transfer and sharing of medical data, and the implementation of big-data analytics tools to inspect them. These tools will be crucial in implementing decision-support systems, allowing treatment to be tailored to each individual patient. The flow of information in healthcare, and in particular in radiation therapy, is overwhelming not only in terms of data volume but also in terms of the diversity of data types involved. Indeed, experts need to analyse patient and tumour data, as well as complex physical dose arrays, and to correlate these with clinical outcomes that also have genetic determinants.

Hadrontherapy faces a dilemma when it comes to designing clinical trials. From a clinical standpoint, the ever-increasing number of hadrontherapy patients would allow randomised trials to be performed – that is, systematic clinical studies in which patients are treated with comparative methods to determine the most effective curative protocol.

However, several considerations add layers of complexity to the clinical-trials landscape: the need to compare standard photon radiotherapy not only with protons but also with carbon ions; the positive results of hadrontherapy treatments for its main indications; and the non-negligible fact that most of the patients who contact a hadrontherapy centre are well informed about the technique, and will not accept being treated with conventional radiotherapy.

Nevertheless, progress on clinical trials is being made. At the ENLIGHT meeting in Cracow, the two dual-ion (proton and carbon) centres in Europe – HIT in Heidelberg (Germany) and CNAO in Pavia (Italy) – presented patient numbers and dose-distribution studies carried out at their facilities. The data were collected mainly in cohort studies within a single institution, and the results often highlighted the need for larger statistics and a unified database. More data from patients treated with carbon ions will soon become available with the opening in 2016 of the MedAustron hadrontherapy centre in Wiener Neustadt (Austria). Clinical trials are also a major focus outside Europe: in the US, several randomised and non-randomised trials have been set up to compare protons with photons, and to investigate either the improvement in survival (for glioblastoma, non-small-cell lung cancer, hepatocellular carcinoma and oesophageal cancer) or the decrease in adverse effects (low-grade glioma, oropharyngeal cancer, nasopharyngeal cancer, prostate cancer and post-mastectomy radiotherapy in breast cancer). Recently, the National Cancer Institute in the US funded a trial comparing conventional radiation therapy and carbon ions for pancreatic cancer.

Besides clinical trials, personalised treatments hold centre stage in the scientific debate on hadrontherapy. Technology is not dormant: developments are crucial to reduce costs, to provide treatments tailored to each specific case, and to reach the necessary level of sophistication in beam delivery to treat complex cases such as tumours inside, or close to, moving organs. In this context, imaging is key. Today, it is becoming obvious that the optimal imaging tool will have to combine different imaging modalities, for example PET and prompt photons. PET is of course a mainstay for dose imaging, but a well-known issue in its application to in-beam, real-time monitoring for hadrontherapy comes from having to allow room for the beam nozzle: partial-ring PET scanners cannot provide full angular sampling, and therefore introduce artefacts in the reconstructed images. The time-of-flight (TOF) technique is often used to improve the image-reconstruction process. An innovative concept, called the J-PET scanner, detects back-to-back photons in plastic scintillators and applies compressive-sensing theory to obtain better signal normalisation, and therefore improve the TOF resolution.
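
The value of good timing is easy to quantify: the TOF difference between the two back-to-back photons localises the annihilation point along the line of response as Δx = c·Δt/2. A minimal numerical check, with an assumed (purely illustrative) coincidence timing resolution:

c = 3.0e8                    # speed of light (m/s)
sigma_t = 200e-12            # assumed coincidence timing resolution (s)
sigma_x = c * sigma_t / 2    # localisation uncertainty along the line of response
print(sigma_x * 100, 'cm')   # -> 3.0 cm; better timing shrinks this directly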

A subject of broad and current interest within the hadrontherapy community is radiobiology. There has been great progress in understanding the molecular tumour response to irradiation with both ions and photons, and the biological consequences of the complex, less-repairable DNA damage caused specifically by ions. Understanding the cell-signalling mechanisms affected by hadrontherapy will lead to improvements in therapeutic efficacy. A particularly thorny issue is the relative biological effectiveness (RBE) of protons and carbon with respect to photons. More extensive and systematic radiobiology studies with different ions, under standardised dosimetry and laboratory conditions, are needed to clarify this and other open issues: these could be carried out at existing and future beamlines at HIT, CNAO and MedAustron, as well as at the proposed CERN OpenMED facility.

The future of ENLIGHT

Since the annual meeting in Summer 2014, the ENLIGHT community has started to discuss the future of the network, both in terms of structure and scientific priorities. It is clear that the focus of R&D for hadrontherapy has shifted since the birth of ENLIGHT, if only for the simple reason that the number of clinical centres (in particular for protons) has dramatically increased. Also, while technology developments are still needed to ensure optimal and more cost-effective treatment, proton therapy is now solidly in the hands of industry. The advent of single-room facilities will bring proton therapy, albeit with some restrictions, to smaller hospitals and clinical centres.

From a clinical standpoint, the major challenge for ENLIGHT in the coming years will be to catalyse collaborative efforts in defining a road map for randomised trials and in studying the issue of RBE in detail. Concerning technology developments, efforts will continue on quality assurance through imaging and on the design of compact accelerators and gantries for ions heavier than protons. Information technologies will take centre stage, because data sharing, data analytics and decision-support systems will be key topics.

Training will be a major focus in the coming years, as the growing number of facilities requires more and more trained personnel: the aim will be to train professionals who are highly skilled in their speciality but at the same time are familiar with the multidisciplinary aspects of hadrontherapy.

Over the years, the ENLIGHT community has shown a remarkable ability to reinvent itself, while maintaining its cornerstones of multidisciplinarity, integration, openness, and attention to future generations. The new list of priorities will allow the network to tackle the latest challenges of a frontier discipline such as hadrontherapy in the most effective way.

The post Networking against cancer with ENLIGHT appeared first on CERN Courier.

]]>
https://cerncourier.com/a/networking-against-cancer-with-enlight/feed/ 0 Feature The European Network for Light Ion Hadron therapy (ENLIGHT) adapts to the rapidly developing hadron-therapy scene. https://cerncourier.com/wp-content/uploads/2016/01/CCenl1_01_16.jpg
Novel radionuclides to kill cancer https://cerncourier.com/a/novel-radionuclides-to-kill-cancer/ https://cerncourier.com/a/novel-radionuclides-to-kill-cancer/#respond Wed, 28 Oct 2015 09:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/novel-radionuclides-to-kill-cancer/ A new radiolabelled molecule obtained by the association of a 177Lu isotope and a somatostatin-analogue peptide is showing potential as a cancer killer for certain types of tumour. It is being developed by Advanced Accelerator Applications (AAA), a radiopharmaceutical company that was set up in 2002 by Stefano Buono, a former CERN scientist. With its […]

The post Novel radionuclides to kill cancer appeared first on CERN Courier.

]]>
A new radiolabelled molecule obtained by the association of a 177Lu isotope and a somatostatin-analogue peptide is showing potential as a cancer killer for certain types of tumour. It is being developed by Advanced Accelerator Applications (AAA), a radiopharmaceutical company that was set up in 2002 by Stefano Buono, a former CERN scientist. With its roots in the nuclear-physics expertise acquired at CERN, AAA started its commercial activity with the production of radiotracers for medical imaging. The successful commercial activity made it possible for AAA to invest in nuclear research to produce innovative radiopharmaceuticals.

177Lu emits both a β particle, which can kill cancerous cells, and a γ ray, which can be useful for SPECT (single-photon emission computed tomography) imaging. Advanced neuroendocrine tumours can be inoperable, and for many patients there are no therapeutic options. However, about 80% of all neuroendocrine tumours overexpress somatostatin receptors, and the radiolabelled molecule is able to target those receptors selectively. The new radiopharmaceutical acts by releasing high-energy electrons after internalization in the tumour cells through these receptors. The tumour cells are destroyed by the radiation, and the drug is rapidly cleared from the body via urine. A complete treatment consists of only four injections, one every six to eight weeks.
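
One way to see why such spacing is comfortable dosimetrically: 177Lu has a half-life of roughly 6.6 days, so the activity delivered by one injection has decayed almost entirely before the next one arrives. The half-life is a standard literature value and the calculation below is our illustration, not AAA's dosimetry.

import math

half_life = 6.65                          # days, approximate half-life of 177Lu
for weeks in (6, 8):
    t = 7 * weeks                         # days between injections
    frac = math.exp(-math.log(2) * t / half_life)
    print(weeks, 'weeks:', frac)          # ~1.3% and ~0.3% of the activity left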

The radiolabelled molecule is currently being used for the treatment of all neuroendocrine tumours on a compassionate-use and named-patient basis in 10 European countries, and approval is being sought in both the EU and the US. A phase-III clinical trial (the NETTER-1 study), conducted in 51 clinical centres in the US and Europe, is testing the product in patients with inoperable, progressive, somatostatin-receptor-positive, mid-gut neuroendocrine tumours. The results of this trial were presented on 27 September in a prestigious Presidential Session at the European Cancer Congress in Vienna, Austria. The NETTER-1 trial demonstrated a statistically significant and clinically meaningful increase in progression-free survival in patients treated with the radiolabelled molecule, compared with patients treated under the current standard of care. The median progression-free survival (PFS) was not reached during the trial in the Lutathera arm – Lutathera being the product's name – and was 8.4 months in the comparative group (p < 0.0001, hazard ratio: 0.21).

Another labelling radionuclide, the positron emitter 68Ga, is a good candidate for producing a novel radiotracer for the precise diagnosis and follow-up of this family of diseases using PET (positron-emission tomography).

The post Novel radionuclides to kill cancer appeared first on CERN Courier.

]]>
https://cerncourier.com/a/novel-radionuclides-to-kill-cancer/feed/ 0 News
LHCf makes the most of a special run https://cerncourier.com/a/lhcf-makes-the-most-of-a-special-run/ https://cerncourier.com/a/lhcf-makes-the-most-of-a-special-run/#respond Wed, 22 Jul 2015 08:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/lhcf-makes-the-most-of-a-special-run/ The motivation of LHCf is to understand the hadronic interactions taking place when high-energy cosmic rays collide with the Earth’s atmosphere.

The post LHCf makes the most of a special run appeared first on CERN Courier.

]]>
Run 2 of the LHC may only just have officially begun, but the Large Hadron Collider forward (LHCf) experiment has already completed its data taking with proton–proton collisions at the new high energy of 13 TeV in the centre of mass. The experiment collected data in a special physics run carried out on 8–12 June, just after the start of Run 2.

The motivation of LHCf is to understand the hadronic interactions taking place when high-energy cosmic rays collide with the Earth’s atmosphere, producing bunches of particles known as air showers. These air showers allow the observation of primary cosmic rays with energies from 10¹⁵ eV to beyond 10²⁰ eV. Because a collision energy of 13 TeV corresponds to the interaction of a proton with an energy of 9 × 10¹⁶ eV hitting the atmosphere, the LHC enables an excellent test of what happens at the energy of the observed air showers.
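
The quoted equivalence follows from the fixed-target relation E_lab ≈ s/(2m_p c²); a two-line check (our arithmetic, not from the article):

m_p = 0.938                      # proton mass (GeV)
sqrt_s = 13000.0                 # LHC centre-of-mass energy (GeV)
E_lab = sqrt_s**2 / (2 * m_p)    # equivalent fixed-target beam energy
print(E_lab)                     # ≈ 9.0e7 GeV, i.e. 9 × 10¹⁶ eV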

The interaction relevant to air-shower development has a large cross-section, with most of the energy going into producing particles that are emitted in the very forward direction – that is, at very small angles to the direction of the incident particle. LHCf therefore uses two detectors, Arm 1 and Arm 2, installed at 140 m on either side of the interaction point in the ATLAS experiment (CERN Courier January/February 2015 p6).

For LHCf to be able to determine the production angle of individual particles, the experiment requires beams that are more parallel than in the usual LHC collisions. In addition, the probability for more than one collision in a single bunch crossing (pile-up) must be far smaller than unity, to avoid contamination from multiple interaction events. To meet these constraints, for the special run the beams were “unsqueezed” instead of being “squeezed”, making them larger at the collision points. This involved adjusting magnets on either side of the interaction point to increase β* – the parameter that characterizes the machine optics for the squeeze – to a value of 19 m. In addition, the collisions took place either with low beam intensities or with beams offset to each other to reduce pile-up.

The first collisions for physics (“stable beams”) were provided at midnight on 10 June with very low pile-up, followed until noon on 13 June by a total of six machine fills providing various pile-up values ranging from 0.003 to 0.05. This allowed LHCf to take more than 32 hours of physics data, as scheduled.
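
For such pile-up values, Poisson statistics give the fraction of triggered bunch crossings (n ≥ 1 interactions) that are contaminated by a second interaction – roughly μ/2 for small μ. A sketch of the estimate (illustrative, not from the LHCf analysis):

import math

for mu in (0.003, 0.05):
    p_one = mu * math.exp(-mu)    # exactly one interaction
    p_any = 1 - math.exp(-mu)     # at least one interaction
    print(mu, 1 - p_one / p_any)  # contamination ≈ 0.15% and 2.5%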

Even with a luminosity of 10²⁹ cm⁻² s⁻¹ – five orders of magnitude below the nominal LHC luminosity – the LHCf detectors achieved a useful data rate of > 500 Hz, recording about 15% of inelastic interactions with neutral particles of energies > 100 GeV. A preliminary analysis during the run showed the clear detection not only of π⁰ mesons but also of η mesons, which had not been the case with the data at the collision energy of 7 TeV in Run 1.

A highlight of the operation was collaboration with the ATLAS experiment. During the special run, trigger signals in LHCf were sent to ATLAS, which recorded data accordingly. The analyses of such common events will enable the classification of events based on the nature of processes such as diffractive dissociation and non-diffractive interactions.

The LHCf detectors were removed from the LHC tunnel on 15 June during the first technical stop of the LHC, to avoid the radiation damage that would occur with the increasingly high luminosity for Run 2.

The post LHCf makes the most of a special run appeared first on CERN Courier.

]]>
https://cerncourier.com/a/lhcf-makes-the-most-of-a-special-run/feed/ 0 News The motivation of LHCf is to understand the hadronic interactions taking place when high-energy cosmic rays collide with the Earth’s atmosphere. https://cerncourier.com/wp-content/uploads/2015/07/CCnew2_06_15.jpg
From Physics to Daily Life: Applications in Informatics, Energy, and Environment and From Physics to Daily Life: Applications in Biology, Medicine and Healthcare https://cerncourier.com/a/bookshelf-144/ Tue, 02 Jun 2015 09:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/bookshelf-144/ Francois Grey reviews in 2015 From Physics to Daily Life: Applications in Informatics, Energy, and Environment and From Physics to Daily Life: Applications in Biology, Medicine and Healthcare.

The post From Physics to Daily Life: Applications in Informatics, Energy, and Environment and From Physics to Daily Life: Applications in Biology, Medicine and Healthcare appeared first on CERN Courier.

]]>
From Physics to Daily Life: Applications in Informatics, Energy, and Environment and From Physics to Daily Life: Applications in Biology, Medicine and Healthcare
By Beatrice Bressan (ed.)
Wiley-Blackwell
Hardback: £60 €75
E-book: £54.99 €66.99
(The prices are for each book separately)
Also available at the CERN bookshop

The old adage that “necessity is the mother of invention” explains, in a nutshell, why an institution like CERN is such a prolific source of new technologies. The extreme requirements of the LHC and its antecedents have driven researchers to make a host of inventions, many of which are detailed in these informative volumes that cover two broad areas of applications.

Eclectic is the word that comes to mind reading through the chapters of the two tomes that are all linked, in one way or another, to CERN. The editor, Beatrice Bressan, has done a valiant job of weaving together different styles and voices, from technical academic treatise to colourful first-hand account. For example, in one of his many insightful asides in a chapter entitled “WWW and More”, Robert Cailliau, a key contributor to the development of the World Wide Web, muses wryly that even after a 30-year career at CERN, it was not always clear to him what “CERN” meant.

Indeed, as the reader is reminded throughout these two books, CERN is the convenient shorthand for several closely connected organizations and networks, each with its own innovation potential. There’s the institution in Geneva whose staff consist primarily of engineers, technicians and administrators who run the facility. Then there’s the much more numerous global community of researchers that develop and manage giant experiments such as ATLAS. And underpinning all of this is the vast range of industrial suppliers, which provide most of the technology used at CERN, often through a joint R&D process with staff at CERN and its partner institutions.

From a purely utilitarian perspective, the justification for CERN surely lies in the contracts it provides to European industry. Without the billions of euros that have been cycled through European firms to build the LHC, there would be little political appetite for such a massive project. As explained in the introductory chapter by Bressan and Daan Boom – reproduced in both volumes, together with a chapter on Swiss spin-offs – there has been a great deal of knowledge transfer thanks to these industrial contracts. Indeed, this more mundane part of CERN’s industrial impact may well dwarf many of the more visible examples of innovation illustrated in subsequent chapters.

Still, as several examples in these two volumes illustrate, there is no doubt that CERN can also generate the sort of “disruptive technologies” that shape our modern world. The web is the most stunning example, but major advances in particle accelerators and radiation sensors have had amazing knock-on effects on industry and society, too, as chapters by famous pioneers such as Ugo Amaldi and David Townsend illustrate clearly.

The question that journalists and other casual observers never cease to ask, though, is why has Europe not benefitted more directly from such breakthroughs? Why did touch screens, developed for the Super Proton Synchrotron control room, not lead to a slew of European high-tech companies? Why was it Silicon Valley and not some valley in Europe that reaped most of the direct commercial benefits of the web? Where are all of the digital start-ups that the hundreds of millions of euros invested in Grid technology were expected to generate?

Chapters on each of these technologies provide some clues to what the real challenge is. As Cailliau remarks wistfully, “WWW is an excellent example of a missed opportunity, but not by CERN.” In other words, to be successful, invention needs not only a scientific mother, it requires an entrepreneurial midwife, too. That is an area where Europe has been sorely lacking.

The only omission in these otherwise wide-ranging and well-researched books, in my opinion, is the lack of discussion on the central role of openness in CERN’s innovation strategy. Open science and open innovation are umbrella terms mentioned enthusiastically in the introductory chapter by Sergio Bertolucci, CERN’s director for research and computing. But there are no chapters dealing specifically with how open-access publication or open-source software and hardware – areas where CERN has for years been a global pioneer – have impacted knowledge transfer and innovation. Perhaps that is a topic broad enough for a third volume.

That said, there is, in these two volumes, already ample food for more thoughtful debate about successful knowledge management and technology transfer in and around European research organizations like CERN. If these books provoke such debate, and that debate leads to progress in Europe’s ability to transform innovations sparked by fundamental physics into applications that improve daily life, they will have made an important contribution.

• For the colloquium held at CERN featuring talks by contributors to these two books, visit https://indico.cern.ch/event/331449/.

The post From Physics to Daily Life: Applications in Informatics, Energy, and Environment and From Physics to Daily Life: Applications in Biology, Medicine and Healthcare appeared first on CERN Courier.

]]>
Review Francois Grey reviews in 2015 From Physics to Daily Life: Applications in Informatics, Energy, and Environment and From Physics to Daily Life: Applications in Biology, Medicine and Healthcare. https://cerncourier.com/wp-content/uploads/2015/06/CCboo1_05_15.jpg
HEPTech: where academia meets industry https://cerncourier.com/a/heptech-where-academia-meets-industry/ https://cerncourier.com/a/heptech-where-academia-meets-industry/#respond Thu, 09 Apr 2015 08:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/heptech-where-academia-meets-industry/ Innovative approaches for technology transfer in Europe.

The post HEPTech: where academia meets industry appeared first on CERN Courier.

]]>
Representatives of HEPTech member organizations

Technologies developed for fundamental research in particle, astro-particle and nuclear physics have an enormous impact on everyday lives. To push back scientific frontiers in these fields requires innovation: new ways to detect one signal in a wealth of data, new techniques to sense the faintest signals, new detectors that operate in hostile environments, new engineering solutions that strive to improve on the best – and many others.

The scientific techniques and high-tech solutions developed by high-energy physics can help to address a broad range of challenges faced by industry and society – from developing more effective medical imaging and cancer diagnosis through positron-emission tomography techniques, to developing the next generation of solar panels using ultra-high vacuum technologies. However, it is difficult and costly not only for many organizations to carry out the R&D needed to develop new applications, products and processes, but also for scientists and engineers to turn their technologies into commercial opportunities.

The aim of the high-energy physics technology-transfer network – HEPTech – is to bring together leading European high-energy physics research institutions so as to provide academics and industry with a single point of access to the skills, capabilities, technologies and R&D opportunities of the high-energy physics community in a highly collaborative open-science environment. As a source of technology excellence and innovation, the network bridges the gap between researchers and industry, and accelerates the industrial process for the benefit of the global economy and wider society.

HEPTech is made up of major research institutions active in particle, astroparticle and nuclear physics. It has a membership of 23 institutions across 16 countries, including most of the CERN member states (see table). Detailed information about HEPTech member organizations and an overview of the network’s activities are published annually in the HEPTech Yearbook and are also available on the network’s website.

So, how was the network born? Jean-Marie Le Goff, the first co-ordinator and present chairman of HEPTech, explains: “Particle physics is a highly co-operative environment. The idea was to spread that spirit over to the Technology Transfer Offices.” So in 2008 a proposal was made to the CERN Council to establish a network of Technology Transfer Offices (TTOs) in the field of particle physics. The same year, Council approved the network for a pilot phase of three years, reporting annually to the European Strategy Session of Council. In the light of the positive results obtained over those three years, Council approved the continuation of the network’s activities and its full operation. “Since then it has grown – both in expanding the number of members and in facilitating bodies across Europe that can bring innovation from high-energy physics faster to industrial exploitation”, says Le Goff.

The primary objective of the HEPTech network is to enhance technology transfer (TT) from fundamental research in physics to society. Therefore, the focus is on furthering knowledge transfer (KT) from high-energy physics to other disciplines, industry and society, as well as on enhancing TT from fundamental research in physics to industry for the benefit of society. The network also aims to disseminate intellectual property, knowledge, skills and technologies across organizations and industry, and to foster collaborations between scientists, engineers and business. Another important task is to enable the sharing of best practices in KT and TT.

HEPTech’s activities are fully in line with its objectives. To foster the contacts with industry at the European level, the network organizes regular academia–industry matching events (AIMEs). These are technology-themed events that provide matchmaking between industrial capabilities and the needs of particle physics and other research disciplines. They are HEPTech’s core offering to its members and the wider community, and the network has an active programme in this respect. Resulting from joint efforts by the network and its members, the AIMEs usually attract about 100 participants from more than 10 countries (figures from 2014). Last year, the topics ranged from the dissemination of micropattern-gas-detector technologies beyond fundamental physics, through potential applications in the technology of controls, to fostering academia–industry collaboration for manufacturing large-area detectors for the next generation of particle-physics experiments, and future applications of laser technologies.

“The topics of the events are driven on the one hand by the technologies we have – it’s very much a push model. On the other hand, they are the results of the mutual effort between the network and its members, where the members have the biggest say because they put in a lot of effort”, says Ian Tracey, the current HEPTech co-ordinator. He believes that a single meeting between the right people from academia and industry is only the first step in the long process of initiating co-operation. To establish a project fully, the network should provide an environment for regular, repeated contact between the same kinds of people. To address this need, HEPTech is looking to increase the number of AIMEs from the initial four up to eight events per year.

“The benefit of having HEPTech as a co-organizer of the AIMEs is clearly the European perspective”, says Katja Kroschewski, head of TT at DESY. “Having speakers from various countries enlarges the horizon of the events and allows coverage of the subject field across Europe. It is different from doing a local event – for instance, having companies only from Hamburg or just with the focus on Germany. As the research work concerned has an international scope, it absolutely makes sense to organize such events. It is good to have the input of HEPTech in shaping the programme of the event and to have the network’s support within the organizing committee as well.”

HEPTech has teamed up with the work package on relations with industry of the Advanced European Infrastructures for Detectors at Accelerators (AIDA) project (which was co-funded by the European Commission under FP7 in 2011–2014), to organize AIMEs on detectors, with a view to fostering collaboration with industry during the pre-procurement phase. A total of seven AIMEs were organized in collaboration with AIDA and the RD51 collaboration at CERN, covering most of the technology fields of importance for detectors at accelerators. HEPTech financed four of them. A total of 101 companies attended the events, giving an average of 14 companies per event. For technology topics where Europe could meet the needs of academia, the percentage of EU industry was about 90% or above, going down to 70% when the leading industry for a technology topic was in the US and/or Asia.

To help event organizers find pertinent academic and industrial players in the hundreds, sometimes thousands, of organizations active in a particular technology, CERN used graph-analysis techniques to develop a tool called “Collaboration spotting”. The tool automatically processes scientific publications, patents and data from various sources, selects pertinent information and populates a database that is later used to automatically generate interactive sociograms representing the activity occurring in individual technology fields. Organizations and their collaborations are displayed in a graph that makes the tool valuable for monitoring and assessing the AIMEs.
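
As a rough sketch of how such a sociogram can be generated – the data and the use of the networkx library are our illustrative assumptions, not a description of CERN's actual implementation – co-publication records can be turned into a weighted collaboration graph like this:

import itertools
import networkx as nx

# each record lists the organizations appearing on one publication or patent
records = [
    {'CERN', 'DESY', 'Acme Detectors'},
    {'CERN', 'Acme Detectors'},
    {'DESY', 'Beta Sensors'},
]

G = nx.Graph()
for orgs in records:
    for a, b in itertools.combinations(sorted(orgs), 2):
        # every shared publication strengthens the edge between two players
        weight = G.get_edge_data(a, b, default={'weight': 0})['weight']
        G.add_edge(a, b, weight=weight + 1)

print(G.edges(data=True))   # the weighted graph behind the sociogram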

However, the findings from AIDA show that it is difficult to conduct an assessment of the impact on industry of an AIME. “To keep a competitive advantage, companies entering a partnership agreement with academia tend to restrict the circulation of this news as much as possible, at least until the results of the collaboration become commercially exploitable,” explains Le Goff. “Although it tends to take some years before becoming visible, an increase in the number of co-publications and co-patents among attendees is a good indicator of collaboration. Clearly some of them could have been initiated at preceding events or under other circumstances, but in any case, the AIME has contributed to fostering or consolidating these collaborations.”

Learning and sharing

Another area of activity is the HEPTech Symposium, which is dedicated to the support of young researchers in developing entrepreneurial skills and in networking. This annual event brings together researchers at an early stage in their careers who are working on potentially impactful technologies in fields related to astro-, nuclear and particle physics. For one week, HEPTech welcomes these Early Stage Researchers from around Europe, providing an opportunity for networking with commercially experienced professionals and TT experts and for developing their entrepreneurial potential.

The first HEPTech Symposium took place in June 2014 in Cardiff. The young researcher whose project attracted the greatest interest was awarded an expenses-paid trip around the UK to look for funding for his project. The 2015 symposium will be held in Prague on 31 May–6 June and will be hosted by Inovacentrum from the Czech Technical University in collaboration with ELI Beamlines and the Institute of Physics of the Academy of Sciences. HEPTech has established a competitive procedure for members that would like to host the event in future. Those interested have to demonstrate their capacity for organizing both a quality training programme and the entertainment of the participants.

Providing opportunities for capacity-building and sharing best practice among its members is of paramount importance to HEPTech. The network is highly active in investigating and implementing novel approaches to TT. A dedicated workgroup on sharing best practices responds to requests from members that are organizing events on a number of subjects relevant to the institutions and their TT process. These include, for instance, workshops presenting cases on technology licensing, the marketing of science and technology, and others. Through workshops, the network is able to upscale the skills of its member institutions and provide capacity-building by sharing techniques and different approaches to the challenges faced within TT. These events – an average of four per year – are driven by the members’ enthusiasm to explore advanced techniques in KT and TT, and help to create a collaborative spirit within the network. The members provide significant assistance in the implementation of these events, including lecturers and workshop organization.

Bojil Dobrev, co-convener of the workgroup on best practices, provides a recent example of best-practice transfer within the network, in which intellectual-property (IP) regulations elaborated by a HEPTech workgroup were successfully used as a basis for the development of IP regulations at Sofia University, Bulgaria. In 2013–2014, a survey focusing on the needs and skills of HEPTech members was conducted within the remit of this workgroup. The objectives were to identify the skills and potential of the HEPTech members and their requirements for support through the network, focusing mainly on early-stage (recently established) TTOs. The survey covered all aspects of a TTO’s operation – from organization and financing, through IP activities, start-ups, licensing and contacts with industry, to marketing and promotion. “The survey was used as a tool to investigate the demand of the TTOs. Its outcomes helped us to map HEPTech’s long-term strategy and to elaborate our annual work plan, particularly in relation to training and best-practice sharing”, explains Dobrev.

Taking into consideration the overall achievements of HEPTech and based on the annual reports of the network co-ordinator, CERN Council encouraged HEPTech to continue its activities and amplify its efforts in the update of the European Strategy for Particle Physics in May 2013. The following year, in September, the Council president gave strong support and feedback for HEPTech’s work.

HEPTech’s collaborative efforts with the European Extreme Light Infrastructure (ELI) project resulted in network membership of all three pillars of the project. Moreover, at the Annual Forum of the EU Strategy for the Danube Region, representatives of governments in the Danube countries acknowledged HEPTech’s role as a key project partner in the Scientific Support to the Danube Strategy initiative.

With its stated vision to become “the innovation access-point for accelerator- and detector-driven research infrastructures” within the next three years, HEPTech is looking to expand – indeed, three new members joined the network in December 2014. It also aims to take part in more European-funded projects and is seeking closer collaboration with other large-scale science networks, such as the European TTO Circle – an initiative of the Joint Research Centre of the European Commission, which aims to connect the TTOs of large European public research organizations.

The post HEPTech: where academia meets industry appeared first on CERN Courier.

]]>
https://cerncourier.com/a/heptech-where-academia-meets-industry/feed/ 0 Feature Innovative approaches for technology transfer in Europe. https://cerncourier.com/wp-content/uploads/2015/04/CChep1_03_15.jpg
Astroparticle, Particle, Space Physics and Detectors for Physics Applications: Proceedings of the 14th ICATPP Conference https://cerncourier.com/a/astroparticle-particle-space-physics-and-detectors-for-physics-applications-proceedings-of-the-14th-icatpp-conference/ Tue, 27 Jan 2015 12:56:35 +0000 https://preview-courier.web.cern.ch/?p=104129 The reports from this conference review topics that range from cosmic-ray observations through high-energy physics experiments to advanced detector techniques.

The post Astroparticle, Particle, Space Physics and Detectors for Physics Applications: Proceedings of the 14th ICATPP Conference appeared first on CERN Courier.

]]>
By S Giani, C Leroy, L Price, P-G Rancoita and R Ruchti (eds)
World Scientific
Hardback: £117
E-book: £88


Exploration of the subnuclear world is done through increasingly complex experiments covering a range of energy in diverse environments, from particle accelerators and underground detectors to satellites in space. These research programmes call for new techniques, materials and instrumentation to be used in detectors, often of large scale. The reports from this conference review topics that range from cosmic-ray observations through high-energy physics experiments to advanced detector techniques.

The post Astroparticle, Particle, Space Physics and Detectors for Physics Applications: Proceedings of the 14th ICATPP Conference appeared first on CERN Courier.

]]>
Review The reports from this conference review topics that range from cosmic-ray observations through high-energy physics experiments to advanced detector techniques. https://cerncourier.com/wp-content/uploads/2022/08/41kj1W740EL._SX312_BO1204203200_.jpg
Two teams take big steps forward in plasma acceleration https://cerncourier.com/a/two-teams-take-big-steps-forward-in-plasma-acceleration/ https://cerncourier.com/a/two-teams-take-big-steps-forward-in-plasma-acceleration/#respond Tue, 27 Jan 2015 09:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/two-teams-take-big-steps-forward-in-plasma-acceleration/ The high electric-field gradients that can be set up in plasma have offered the promise of compact particle accelerators since the late 1970s.

The post Two teams take big steps forward in plasma acceleration appeared first on CERN Courier.

]]>

The high electric-field gradients that can be set up in plasma have offered the promise of compact particle accelerators since the late 1970s. The basic idea is to use the space-charge separation that arises in the wake of either an intense laser pulse or a pulse of ultra-relativistic charged particles. Towards the end of 2014, groups working on both approaches reached important milestones. One team, working at the Facility for Advanced Accelerator Experimental Tests (FACET) at SLAC, demonstrated plasma-wakefield acceleration with both a high gradient and a high energy-transfer efficiency – a crucial combination not previously achieved. At Lawrence Berkeley National Laboratory, a team working at the Berkeley Lab Laser Accelerator (BELLA) facility boosted electrons to the highest energies ever recorded for the laser-wakefield technique.

Several years ago, a team at SLAC successfully accelerated electrons in the tail of a long electron bunch from 42 GeV to 85 GeV in less than 1 m of plasma. In that experiment, the particles leading the bunch created the wakefield to accelerate those in the tail, and the total charge accelerated was small. Since then, FACET has come on line. Using the first 2 km of the SLAC linac to deliver an electron beam of 20 GeV, the facility is designed to produce pairs of high-current bunches with a small enough separation to allow the trailing bunch to be accelerated in the plasma wakefield of the drive bunch.

Using the pairs of bunches at FACET, some of the earlier team members together with new colleagues have carried out an experiment in the so-called “blow-out” regime of plasma-wakefield acceleration, where maximum energy gains at maximum efficiencies are to be found. The team succeeded in accelerating some 74 pC of charge in the core of the trailing bunch of electrons to about 1.6 GeV per particle in a gradient of about 4.4 GeV/m (Litos et al. 2014). The final energy spread for the core particles was as low as 0.7%, and the maximum efficiency of energy transfer from the wake to the trailing bunch was in excess of 30%.

Meanwhile, a team at Berkeley has been successfully pursuing laser-wakefield acceleration for more than a decade. This research was boosted when the specially conceived BELLA facility recently came on line with its petawatt laser. In work published in December, the team at BELLA used laser pulses at 0.3 PW peak power to create a plasma channel in a 9-cm-long capillary discharge waveguide and accelerate electrons to the record energy of 4.2 GeV (Leemans et al. 2014). Importantly, the 16 J of laser energy used was significantly lower than in previous experiments – a result of using the preformed plasma waveguide set up by pulsing an electrical discharge through hydrogen in a capillary. The combination of increased electron-beam energy and lower laser energy bodes well for the group’s aim to reach the target of 10 GeV.
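
A quick back-of-the-envelope comparison of the two results, using only the numbers quoted above (our arithmetic, so order-of-magnitude only):

facet_gain = 1.6        # GeV gained per particle at FACET
facet_gradient = 4.4    # GeV/m, the quoted gradient
print(facet_gain / facet_gradient)    # ≈ 0.36 m of effective acceleration

bella_energy = 4.2      # GeV reached at BELLA
bella_length = 0.09     # m, the 9-cm capillary
print(bella_energy / bella_length)    # ≈ 47 GeV/m average gradient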

The post Two teams take big steps forward in plasma acceleration appeared first on CERN Courier.

]]>
https://cerncourier.com/a/two-teams-take-big-steps-forward-in-plasma-acceleration/feed/ 0 News The high electric-field gradients that can be set up in plasma have offered the promise of compact particle accelerators since the late 1970s. https://cerncourier.com/wp-content/uploads/2015/01/CCnew15_01_15th.jpg
Nanotube cathode promises intense electron beam https://cerncourier.com/a/nanotube-cathode-promises-intense-electron-beam/ https://cerncourier.com/a/nanotube-cathode-promises-intense-electron-beam/#respond Mon, 27 Oct 2014 09:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/nanotube-cathode-promises-intense-electron-beam/ It looks like a small black button, but in tests at Fermilab’s High-Brightness Electron Source Lab it has produced beam currents greater than those generated with a large laser system.

The post Nanotube cathode promises intense electron beam appeared first on CERN Courier.

]]>

It looks like a small black button, but in tests at Fermilab’s High-Brightness Electron Source Lab it has produced beam currents 10³–10⁶ times greater than those generated with a large laser system. Designed by a collaboration led by RadiaBeam Technologies, a California-based technology firm actively involved in accelerator R&D, this electron source is based on a carbon-nanotube cathode only 15 mm across.

Carbon-nanotube cathodes have already been studied extensively in university research labs, but Fermilab is the first accelerator facility to test the technology within a full-scale setting. With its capability and expertise for handling intense electron beams, it is one of relatively few labs that can support a project like this.

Traditionally, accelerator scientists use lasers to strike cathodes and eject electrons through photoemission. With the nanotube cathode, a strong electric field pulls streams of electrons off the surface of the cathode through field emission. There were early concerns that the strong electric fields would cause the cathode to self-destruct, but one of the first discoveries the team made when testing began in May was that the cathode did not explode: the exceptional strength of carbon nanotubes prevents it from being destroyed. The team used a field of around 22 MV/m to produce the target current of more than 350 mA.
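
Field emission is described by the Fowler–Nordheim relation, in which sharp nanotube tips multiply the applied field locally. The sketch below is only illustrative: the work function and the field-enhancement factor are assumed values, not measurements from this cathode.

import math

A = 1.54e-6     # Fowler–Nordheim prefactor (A·eV·V⁻²)
B = 6.83e9      # Fowler–Nordheim exponent constant (eV^-3/2·V/m)
phi = 4.8       # assumed work function of carbon nanotubes (eV)
beta = 300      # assumed geometric field enhancement at the tips
E = 22e6        # applied macroscopic field (V/m)

F = beta * E    # local field at a nanotube tip
J = (A * F ** 2 / phi) * math.exp(-B * phi ** 1.5 / F)
print(J, 'A/m^2')   # emission rises steeply with beta and E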

The technology has extensive potential applications in medical equipment, for example, since an electron beam is a critical component in generating X-rays.

• A Department of Energy Small Business Innovation Research grant funds the RadiaBeam-Fermilab-Northern Illinois University collaboration.

The post Nanotube cathode promises intense electron beam appeared first on CERN Courier.

]]>
https://cerncourier.com/a/nanotube-cathode-promises-intense-electron-beam/feed/ 0 News It looks like a small black button, but in tests at Fermilab’s High-Brightness Electron Source Lab it has produced beam currents greater than those generated with a large laser system. https://cerncourier.com/wp-content/uploads/2014/10/CCnew19_09_14.jpg
CERN and ITER cooperate https://cerncourier.com/a/cern-and-iter-co-operate/ https://cerncourier.com/a/cern-and-iter-co-operate/#comments Tue, 23 Sep 2014 08:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/cern-and-iter-co-operate/ The LHC and ITER project share many technologies, providing a natural basis for collaboration.

The post CERN and ITER cooperate appeared first on CERN Courier.

]]>

In November 2006, the last LHC dipole and quadrupole cold masses arrived at CERN, signalling the end of the industrial construction of the major components of the new 27-km particle collider (CERN Courier October 2006 p28 and January/February 2007 p25). The LHC then entered the installation and commissioning phases. In the same month, at the Elysée Palace in Paris, the ITER Agreement was signed by seven parties: China, the EU, India, Japan, Korea, Russia and the US. The agreement’s ratification in October the following year marked the start of a new mega-science project – ITER, standing originally for the International Thermonuclear Experimental Reactor – that in many respects is the heir of the LHC. Both machines are based on, for example, a huge superconducting magnet system, large cryogenic plants of unmatched power, a large volume of ultra-high vacuum, a complex electrical powering system, sophisticated interlock and protection systems, high-technology devices and work in highly radioactive environments.

The two projects share many technologies and operating conditions and are both based on large international collaborations. These elements constitute the basis for a natural collaboration between the two projects, despite there being distinct differences between their managerial and sociological models.

In the years 2007–2012, CERN could not engage in new large projects, not only because effort was focussed on installation and commissioning of the LHC – and repair and consolidation (CERN Courier April 2009 p6) – but also because of budgetary constraints set by the repayment of loans for its construction. Many groups and departments at CERN faced a related reduction of personnel. In contrast, the new ITER organization had to be staffed and become immediately operational to organize the procurement arrangements between ITER and the domestic agencies acting for the seven members. Indeed, some new staff members were recruited from laboratories that had just finished their engagement with the LHC, such as the CEA in France and CERN itself. However, the number of staff, compounded by the need to train some of them, was not sufficient to satisfy ITER’s needs. For example, the ITER magnet system – perhaps the largest technical challenge of the whole project – required many further detailed studies before the design could be brought to sufficient maturity to allow hand-over to the domestic agencies for construction. The ITER magnet management was also interested in benefitting from the technical skills and project-management experience for large-scale procurement from industry that CERN had accumulated during construction of the LHC.

Beyond the primary reasons for collaboration between CERN and ITER, there were further reasons that made it attractive to both parties. For CERN there was the possibility of conducting R&D and studies in key domains, despite the lack of new projects and internal funding. Examples include:

  • the superconductor reference laboratory, set up for the ITER organization, which has proved useful for CERN’s internal programme for the new High-Luminosity LHC, formally launched in 2011;
  • qualification of new commercial nuclear-radiation-hard optical fibres, with measurements also at cryogenic temperatures;
  • design of high-temperature superconductor (HTS) 70-kA-class current leads, with sophisticated 3D simulations and experimental mock-ups;
  • setting up a unique high-voltage laboratory for cryo-testing insulation and instrumentation equipment;
  • new concepts and controllers for the HTS current leads and magnet-protection units; and
  • activities in metallurgy, welding and material testing, which have helped to increase CERN’s already world-renowned competence in this domain.

The list could be longer. Only a minor part of the activity was supplying a “service” or the transfer of knowledge. In many cases the activity was new design, new R&D or validation of beyond-state-of-the-art concepts.

For ITER, the benefit lay not only in receiving the services and studies, for which it paid. It was also in having access to a large spectrum of competence in a single organization. CERN could react promptly to the demands and needs stemming from contracts and unexpected difficulties in the multiparty complex system set up for ITER construction.

Discussions between CERN and ITER management started in 2007 and were formalized with a framework co-operation agreement, signed at CERN by the directors-general of the two organizations on 6 March 2008. This agreement foresaw a co-ordination committee that was in fact not set up until 2012, and has met only twice so far, because the collaboration is working so smoothly that no issues have been raised. The collaboration was implemented through contracts, called implementing agreements (IAs), under the co-operation agreement. Each IA details its specific content, goals, deliverables, duration and resources.

Table 1 lists the 18 IAs signed so far between CERN and ITER. Each year from 2008, an IA was signed according to the needs of ITER and the possibilities and interests at CERN. Standard IAs span one calendar year and contain a variety of different tasks. However, IAs with extended durations of up to five years soon became necessary to secure long-term services, by giving CERN the possibility of hiring new personnel beyond the numbers allowed by its internal budget. In total, CERN has had eight annual contracts so far, one short-term contract (IA12) and nine multiyear contracts, two of them lasting five years – one for operation of the superconductor reference laboratory (IA4) and one for metallurgy and material testing for the magnet system (IA14).

As already mentioned, the Co-ordination Committee was not set up until 2012, so the various agreements were overseen by a Steering Committee – later renamed the Technical Committee, to distinguish it better from the Co-ordination Committee – composed of two members per party. The membership of these committees has been relatively constant, and this continuity in management, with smooth handovers, is probably one of the reasons for the success of the collaboration. Some IAs started outside the usual entry points and were later adjusted to report inside the framework. The CERN–ITER collaboration is a textbook case: managing relations between complex organizations that each sit at the centre of a network of institutes is an endless job.

The steering and technical committees meet twice a year and each session is prepared carefully. The committee members review the technical work and resolve resource problems by reshuffling various tasks or setting up amendments to the IAs – which has happened only five times and never for extra costs, only to adjust the execution of work to needs. Long-term planning of work and of future agreements is done in the meetings for the best use of resources, and problems are tackled at their outset. So far, no disputes, even minor ones, have occurred.

As in any sustainable collaboration, there are deep discussions on the allocation of resources, most of which are personnel-related, with only a minor part concerning consumables. Figure 1 shows the budget allocated for the execution of the agreements. The total of more than CHF 14 million corresponds to approximately 80–90 full-time-equivalent years used by CERN to fulfil the agreements. Most personnel are CERN staff, in some cases recruited ad hoc, but fellows and associated personnel are also involved.

The examples in figure 2 show a few of the most important technical achievements. One of the key ingredients of the success of the CERN-ITER collaboration is that checks are done on deliverables, rather than on detailed accounting or time-sheet reporting. This has been possible because of the technical competence of both the management and the technical leaders of the various tasks, as well as of the personnel involved, on both sides. Goals and deliverables, even the most difficult ones, were evaluated correctly and reasonable resources allocated at the outset, with a fair balance and good appreciation of margins. This leads to the conclusion that – despite modern management guidelines – technical competence is not a nuisance: it can make the difference.

A lifetime in biophysics https://cerncourier.com/a/a-lifetime-in-biophysics/ Tue, 26 Aug 2014 09:00:00 +0000

Shake hands with Eleanor Blakely and you are only one handshake away from John Lawrence – a pioneer of nuclear medicine and brother of Ernest Lawrence, the Nobel-prize-winning inventor of the cyclotron, the first circular particle accelerator. In 1954 – the year that CERN was founded – John Lawrence began the first use of proton beams from a cyclotron to treat patients with cancer. Twenty years later, as a newly fledged biophysicist, Blakely arrived at the medical laboratory that John had set up at what is now the Ernest Orlando Lawrence Berkeley National Laboratory. There she came to know John personally and was to become established as a leading expert in the use of ion beams for cancer therapy.

With ideas of becoming a biology teacher, Blakely went to the University of San Diego in 1965 to study biology and chemistry. While there, she spent a summer as an intern at Oak Ridge National Laboratory and developed an interest in radiation biology. Excelling in her studies, she was encouraged to move towards medicine after obtaining her BA in 1969. However, armed with a fellowship from the Atomic Energy Commission that allowed her to choose where to go next, she decided to join the group of Howard Ducoff, a leading expert in radiation biology at the University of Illinois, Urbana-Champaign. Because she was fascinated by basic biological mechanisms, Ducoff encouraged her to take up biophysics, a field so new that he told her that it was “whatever you want to make it”. A requirement of the fellowship was to spend time at a national laboratory, so Blakely was assigned a summer at Berkeley Laboratory, where she worked on NASA-funded studies of proton radiation on murine skin and subsequent changes in blood electrolytes, which led to a Master’s degree in biophysics.

After gaining her PhD studying the natural radioresistance of cultured insect cells, Blakely joined the staff at Berkeley Lab in 1975, arriving soon after the Bevatron – the accelerator where the antiproton was discovered – had been linked up to the heavy-ion linear accelerator, the SuperHILAC. The combination, known as the Bevalac, could accelerate ions as heavy as uranium to high energies. Blakely joined the group led by Cornelius Tobias. His research included studies related to the effects of cosmic rays on the retina, for which he exposed his own eye to ion beams to confirm his explanation of why astronauts saw light flashes during space flight. “It was a spectacular beginning, seeing my boss getting his eye irradiated,” Blakely recalls. For her own work, Tobias showed her a theoretical plot of the stopping power versus range for the different ion beams available at Berkeley. Her task was to work out which would be the best beam for cancer therapy. “I had no idea how much work that was going to be,” she says, “and it is still not settled!”

Thirty years before Blakely arrived at Berkeley, Robert Wilson, later founding director of Fermilab, had been working there with Ernest Lawrence when he realized that because protons and heavier ions deposit most of their energy near the end of their range in matter – the famous “Bragg peak” – they offered the opportunity of treating deep-seated tumours while minimizing damage to surrounding tissue (CERN Courier December 2006 p17). Assigned the task of studying the biological effectiveness of a variety of particles and energies available from Berkeley’s accelerators, Blakely irradiated dishes of human cell cultures, working along increasing depths of the Bragg peak for the various beams under different conditions. In particular, by spreading the energy of the incident particles the team could broaden the Bragg peak from a few millimetres to several centimetres.
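
The physics Wilson exploited is easy to make concrete. The sketch below uses the empirical Bragg–Kleeman rule, R ≈ αEᵖ, to estimate how the beam energy sets the depth of the Bragg peak in water – and hence why spreading the incident energies spreads the peak. The fit constants are commonly quoted textbook values for protons in water, not numbers from Blakely’s experiments:

```python
# A minimal sketch of the range-energy relation behind the Bragg peak,
# using the empirical Bragg-Kleeman rule R = alpha * E**p. The constants
# (alpha ~ 0.0022 cm/MeV^p, p ~ 1.77 for protons in water) are commonly
# quoted fit values and are assumptions here, not measured inputs.

def proton_range_water_cm(energy_mev, alpha=0.0022, p=1.77):
    """Approximate range (cm) of a proton of the given energy in water."""
    return alpha * energy_mev ** p

def residual_energy(e0_mev, depth_cm, alpha=0.0022, p=1.77):
    """Energy (MeV) remaining after depth_cm of water; 0 if the proton stops."""
    residual_range = proton_range_water_cm(e0_mev, alpha, p) - depth_cm
    return (residual_range / alpha) ** (1.0 / p) if residual_range > 0 else 0.0

if __name__ == "__main__":
    for e0 in (70, 150, 250):  # typical therapy energies, MeV
        print(f"{e0:3d} MeV proton: range ~ {proton_range_water_cm(e0):4.1f} cm in water")
```

Raising the energy from 70 to 250 MeV moves the peak from about 4 cm to nearly 40 cm deep, which is why a spread of incident energies produces the broadened, clinically useful peak.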

The studies revealed that for carbon and neon ions, in the region before the Bragg peak there was a clear difference in cell survival under aerobic (oxygen) or hypoxic (nitrogen) conditions, while in the Bragg peak the relative biological effectiveness, as measured by cell survival, was more independent of oxygen than for X-rays or γ rays (Blakely et al. 1979). This boded well for the use of these ions in treating tumours, because many tumour cells are resistant to radiation damage under hypoxic conditions. For argon and silicon, however, the survival curves in oxygen and nitrogen already showed high cell killing and a reduced oxygen effect in the entrance region of the Bragg curve before the peak, indicating that at higher atomic number these ions were already too damaging and did not afford, in the beam entrance, the radioprotection seen with particles of lower atomic number. The work had important ramifications for the development of hadron therapy today: while Berkeley went on to use neon ions for treatments, therapy with carbon ions was to become of major importance, first in Japan and then in Europe (CERN Courier December 2011 p37).

At Berkeley, she was plunged into a world of physics. “I had to learn to talk to physicists,” she recalls. “I had only basic physics from school – I learnt a lot of particle physics.” And in common with many physicists, it is a desire to understand how things work that has driven Blakely’s research, with the added attraction of being able to help people. Her interest lies deep in the cell cycle and what happens to the DNA, for example, as a function of radiation exposure. While her work has been of great value in helping oncologists, it is the fundamental processes that fascinate her as “a bench-top scientist”, to use her own words. “I’m interested in the body’s feedback mechanisms,” she explains.

That does not reduce her humanity. Some of the treatments at Berkeley used a beam of helium ions directed through the lens to destroy tumours of the retina. Blakely was devastated to learn that although the tumour was destroyed, the patients developed cataracts – a late radiation effect of exposure to the lens adjacent to some retinal tumours, which required lens-replacement surgery. As a result, she not only helped to propose a more complex technique to irradiate the tumours by directing the beam though the sclera (the tough, white outer layer of the eye) instead of the lens, but also became interested in the effects of radiation on the lens of the eye – a field in which she is a leading expert.

In 1993, the Bevalac was shut down, leaving Blakely and her colleagues at Berkeley without an accelerator with energies high enough for hadron therapy. “It was such an old machine,” she says. “Everyone had worked their hearts out to treat the patients.” The Bevalac had produced the heavier ion beams, while the 184-inch accelerator had produced beams of helium ions, and together almost 2500 cancer patients had been treated.

With her interest in irradiation of the eye, Blakely followed her first group leader “into space” – at least as a “bench-top” scientist – with studies of the effects of low radiation doses for the US space agency, NASA. “In space, people are exposed to chronic low doses of radiation,” she explains. In particular, she has been studying heavy-ion-induced tumourigenesis in mice with a broad gene pool similar to humans, to evaluate any risks in space travel.

Given that hadron therapy began 60 years ago at Berkeley, it is striking that nowadays there are no treatment centres in the US that use nuclei any heavier than the single protons of hydrogen. Japan was the first country to have a heavy-ion accelerator built for medical purposes – the Heavy Ion Medical Accelerator in Chiba (HIMAC) that started in 1994 (CERN Courier July/August 2007 p17 and June 2010 p22). During the last 10 years, Europe has followed suit, with the Heidelberg Ion-Beam Therapy Centre in Germany, and the Centro Nazionale di Adroterapia Oncologica in Italy using carbon-ion beams on an increasing number of patients (CERN Courier December 2011 p37). Another new centre, MedAustron in Austria, is now reaching the commissioning phase (CERN Courier October 2011 p33). Blakely describes the situation in her homeland as “a tragedy – the technology emerged from the US but we don’t have the machines”. Part of the problem lies with the country’s health-care plan, she says. “The treatments are not yet reimbursable, and the government won’t support building machines.”

Nevertheless, there is a glimmer of hope, following a workshop on ion-beam therapy organized by the US Department of Energy and the National Cancer Institute in Bethesda in January 2013, with participants from medicine, physics, engineering and biology. P20 Exploratory Planning Grants for a National Center for Particle Beam Radiation Therapy Research in the US are now pending. “Sadly this doesn’t give us money to build a machine – legally the government isn’t allowed to do that – but the P20 can provide for infrastructure, research and networking once you have a machine,” Blakely explains. However, there is support for patients from the US to take part in randomized clinical trials – the “gold standard” for determining the best modality for treating a patient. At the same time, she envies the networking and other achievements of the European Network for Light Ion Hadron Therapy (ENLIGHT), co-ordinated at CERN, which promotes international R&D, networking and training (CERN Courier December 2012 p19). “Networking is really important but it wasn’t something they taught us at school,” she says, “and training for students and staff is essential for the use of hadron therapy to have a future… The many programmes that have been developed [by ENLIGHT] are extremely important and valuable, and I wish we had them in the US.”

Looking back on a career that spans 40 years, Blakely says: “It has been fulfilling, but a lot of work.” And what aspect is she most proud of? “Probably the paper from 1979,” she answers, “the result of many nights working at the accelerator.” When the focal point of hadron therapy moved to Japan, researchers there repeated her work. “They found the data were exactly reproducible,” she says with clear pleasure. Would she recommend the same work to a young person today? “With the current funding situation in the US,” she says, “I tell people that you have to love it more than eating – you need to be really committed.” Perhaps, one day, hadron therapy will return home, and the line of research begun by pioneers such as John Lawrence and Cornelius Tobias will inspire a new generation of people like Blakely.

OECD report praises innovation at CERN https://cerncourier.com/a/oecd-report-praises-innovation-at-cern/ Wed, 23 Jul 2014 08:00:00 +0000

In early June, the Organisation for Economic Co-operation and Development (OECD) published their Global Science Forum (GSF) report, “The Impacts of Large Research Infrastructures on Economic Innovation and on Society: Case studies at CERN”. The report praises the culture of innovation at CERN, and finds that the laboratory has “evident links to economic, political, educational and social advances of the past half-century”.

Through in-depth, confidential interviews with the people involved directly, the report focuses on two of CERN’s projects: the development of superconducting dipole magnets for the LHC and the organization’s contribution to hadron therapy.

A total of 1232 superconducting dipoles – each 14 m long and weighing 35 tonnes – steer the particle beams in the LHC. Following the R&D phase in the years 1985–2001, a call to tender was issued for the series production of the dipoles. R&D had included building a proof-of-concept prototype, meeting the considerable challenge of designing superconducting cables made of niobium-titanium (NbTi), and designing a complex cryostat system to keep the magnets cold enough to operate under superconducting conditions (CERN Courier October 2006 p28).

The report notes that although innovation at the cutting edge of technology is “inherently difficult, costly, time consuming and risky”, CERN mitigated those risks by keeping direct responsibility, decision-making and control for the project. While almost all of the “intellectual added value” from the project stemmed from CERN, contractors interviewed for the study reported their experience with the organization to be positive. CERN’s flexibility and ability to innovate attracts creative, ambitious individuals, such that “success breeds success in innovation”, note the report’s authors.

The second case study covered CERN’s contribution to hadron therapy using beams of protons, or heavier nuclei such as carbon, to kill tumours. The authors attribute CERN’s success in pushing through medical research to its relatively “flat” hierarchy, where students and junior members of staff can share ideas freely with heads of department or management. A key project was the three-year Proton Ion Medical Machine Study, which started in 1996 and submitted a complete accelerator-system design in 1999 (CERN Courier October 1998 p20). CERN’s involvement in hadron therapy is also a story of collaboration – the laboratory retains close links with CNAO, the National Centre for Oncological Hadron Therapy in Italy, and the MedAustron centre in Austria, among others (CERN Courier December 2011 p37).

The report also praises the longevity of CERN, which allows it to “recycle” its infrastructure for new projects, and the organization’s staff. This manpower is described as a “great asset” that can be deployed in response to strategic “top down” decisions or to initiatives that arise in a “bottom up” mode.

• For the full report, see www.oecd.org/sti/sci-tech/CERN-case-studies.pdf.

ESS: neutron beams at the high-intensity frontier https://cerncourier.com/a/ess-neutron-beams-at-the-high-intensity-frontier/ Thu, 22 May 2014 08:00:00 +0000 A look at what will be the world’s most powerful neutron source.

Today, neutron research takes place either at nuclear reactors or at accelerator-based sources. For a long time, reactors have been the most powerful sources in terms of integrated neutron flux. Nevertheless, accelerator-based sources, which usually have a pulsed structure (SINQ at PSI being a notable exception), can provide a peak flux during the pulse that is much higher than at a reactor. The European Spallation Source (ESS) – currently under construction in Lund – will be based on a proton linac that is powerful enough to give a higher integrated useful flux than any research reactor. It will be the world’s most powerful facility for research using neutron beams when it comes into full operation early in the next decade. Although driven by the neutron-scattering community, the project will also offer the opportunity for experiments in fundamental physics, and there are plans to use the huge number of neutrinos produced at the spallation target for neutrino physics.

The story of the ESS goes back to the early 1990s, with a proposal for a 10 MW linear accelerator, a double compressor ring and two target stations. The aim was for an H⁻ linac to deliver alternate pulses to a long-pulse target station and to the compressor rings. The long-pulse target was to receive 2-ms-long pulses from the linac, while multiturn injection into the rings would provide a compression factor of 800 and allow a single turn of 1.4 μs to be extracted to the short-pulse target station.

This proposal was not funded, however, and after a short hiatus, new initiatives to build the ESS appeared in several European countries. By 2009, three candidates remained: Hungary (Debrecen), Spain (Bilbao) and Scandinavia (Lund). The decision to locate the ESS near Lund was taken in Brussels in May 2009, after a competitive process facilitated by the European Strategy Forum for Research Infrastructures and the Czech Republic’s Ministry of Research during its period of presidency of the European Union. In this new incarnation, the proposal was to build a facility with a single long-pulse target powered by a 5 MW superconducting proton linac (figure 1). The neutrons will be released from a rotating tungsten target hit by 2 GeV protons emerging from this superconducting linac, with its unprecedented average beam power.

Neutrons have properties that make them indispensable as tools in modern research. They have wavelengths and energies such that objects can be studied with a spatial resolution between 10⁻¹⁰ m and 10⁻² m, and with a time resolution between 10⁻¹² s and 1 s. These length- and time-scales are relevant for dynamic processes in bio-molecules, pharmaceuticals, polymers, catalysts and many types of condensed matter. In addition, neutrons interact quite weakly with matter, so they can penetrate large objects, allowing the study of materials surrounded by vacuum chambers, cryostats, magnets or other experimental equipment. Moreover, in contrast to the scattering of light, neutrons interact with atomic nuclei, so that neutron scattering is sensitive to isotope effects. As an extra bonus, neutrons also have a magnetic moment, which makes them a unique probe for investigations of magnetism.

Neutron scattering also has limitations. One of these is that neutron sources are weak compared with sources of light or of electrons. Neutrons are not created, but are “mined” from atomic nuclei where they are tightly bound, and it costs a significant amount of energy to extract them. Photons, on the other hand, can be created in large amounts, for instance in synchrotron light sources. Experiments at light sources can therefore be more sensitive in many respects than those at a neutron source. For this reason, the siting of ESS next to MAX IV – the next-generation synchrotron radiation facility currently being built on the north-eastern outskirts of Lund – is important. Thanks to its pioneering magnet technology, MAX IV will be able to produce light with higher brilliance than at any other synchrotron light source, while the ESS will be the most powerful neutron source in the world.

The ESS will provide unique opportunities for experiments in fundamental neutron physics that require the highest possible integrated neutron flux. A particularly notable example is the proposed search for neutron–antineutron oscillations. The high neutron intensity at the ESS will allow sufficient precision to make neutron experiments complementary to efforts in particle physics at the highest energies, for example at the LHC. The importance of the low-energy, precision “frontier” has been recognized widely (Raidal et al. 2008 and Hewett et al. 2012), and an increasing number of theoretical studies have exploited this complementarity and highlighted the need for further, more precise experimental input (Cirigliano and Ramsey-Musolf 2013).

In addition, the construction of a proton accelerator at the high-intensity frontier opens possibilities for investigations of neutrino oscillations. A collaboration is being formed by Tord Ekelöf and Marcos Dracos to study a measurement of CP violation in neutrinos using the ESS together with a large underground water Cherenkov detector (Baussan et al. 2013).

The main components

The number of neutrons produced at the tungsten target will be proportional to the beam current, and because the total production cross-section in the range of proton energies relevant for the ESS is approximately linear with energy, the total flux of neutrons from the target is nearly proportional to the beam power. Given a power of 5 MW, beam parameters have been optimized with respect to cost and reliability, while user requirements have dictated the pulse structure. Table 1 shows the resulting top-level parameters for the accelerator.

The linac will have a normal-conducting front end, followed by three families of superconducting cavities, before a high-energy beam transport brings the protons to the spallation target. Because the ESS is a long-pulse source, it can use protons rather than the H⁻ ions needed for efficient injection into the accumulator ring of a short-pulse source.

Figure 2 illustrates the different sections of the linac. In addition to the ion source on a 75 kV platform, the front end consists of a low-energy beam transport (LEBT), a radio-frequency quadrupole that accelerates to 3.6 MeV, a medium-energy beam transport (MEBT) and a drift-tube linac (DTL) that takes the beam to 90 MeV.

The superconducting linac, operating with superfluid helium at 2 K, starts with a section of double-spoke cavities having an optimum beta of 0.50. The protons are accelerated to 216 MeV in 13 cryomodules, each of which has two double-spoke cavities. Medium- and high-beta elliptical cavities follow, with geometric beta values of 0.67 and 0.92. The medium-beta cavities have six cells, the high-betas have five cells. In this way, the two cavity types have almost the same length, so that cryomodules of the same overall design can be used in both cases to house four cavities. Figure 3 shows a preliminary design of a high-beta cryomodule, with its four five-cell cavities and power couplers extending downwards.

Nine medium-beta cryomodules accelerate the beam to 516 MeV, and the final 2 GeV is reached with 21 high-beta modules. The normal-conducting acceleration structures and the spoke cavities run at 352.21 MHz, while the elliptical cavities operate at twice the frequency, 704.42 MHz. After reaching their full energy, the protons are brought to the target by the high-energy beam transport (HEBT), which includes rastering magnets that produce a 160 × 60 mm rectangular footprint on the target wheel.
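
A quick kinematic cross-check – simple relativity, not taken from the ESS design documentation – shows why the cavity families have geometric betas of 0.50, 0.67 and 0.92: the proton velocity β = v/c at each quoted transition energy brackets those values.

```python
# Relativistic beta of a proton at the ESS section boundaries quoted above.
# The proton rest energy is standard; the transition energies come from the text.
M_P = 938.272  # proton rest energy, MeV

def beta(kinetic_energy_mev):
    """v/c for a proton with the given kinetic energy."""
    gamma = 1.0 + kinetic_energy_mev / M_P
    return (1.0 - 1.0 / gamma**2) ** 0.5

if __name__ == "__main__":
    for label, t_mev in [("DTL exit", 90), ("spoke exit", 216),
                         ("medium-beta exit", 516), ("linac exit", 2000)]:
        print(f"{label:17s} {t_mev:5d} MeV  beta = {beta(t_mev):.3f}")
    # beta rises from ~0.41 to ~0.95; each cavity family (0.50, 0.67, 0.92)
    # sits within the velocity range of the section it serves.
```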

The design of the proton accelerator – as with the other components of the ESS – has been carried out by a European collaboration. The ion source and LEBT have been designed by INFN Catania, the RFQ by CEA Saclay, the MEBT by ESS-Bilbao, the DTL by INFN Legnaro, the spoke section by IPN Orsay, the elliptical sections again by CEA Saclay, and the HEBT by ISA Århus. During the design phase, additional collaboration partners included the universities of Uppsala, Lund and Huddersfield, NCBJ Świerk, DESY and CERN. Now the collaboration is being extended further for the construction phase.

A major cost driver of the ESS accelerator is the RF sources. Klystrons provide the standard solution for high output power at the frequencies relevant to the ESS. For the lower power of the spoke cavities, tetrodes are an option, but solid-state amplifiers have not been excluded completely, even though the required peak powers have not yet been demonstrated. Inductive output tubes (IOTs) are an interesting option for the elliptical cavities, in particular for the high-beta cavities, where the staged installation of the linac still allows for a few years of studies. While IOTs are more efficient and take up less space than klystrons, they are not yet available at the peak powers required, but the ESS is funding the development of higher-power IOTs in industry.

Neutron production

The ESS will use a rotating, gas-cooled tungsten target rather than, for instance, the liquid-mercury targets used at the Spallation Neutron Source in the US and in the neutron source at the Japan Proton Accelerator Research Complex. As well as avoiding environmental issues that arise with mercury, the rotating tungsten target will require the least amount of development effort. It also has good thermal and mechanical properties, excellent safety characteristics and high neutron production.

The target wheel has a diameter of 2.5 m and consists of tungsten elements in a steel frame (figure 4). The tungsten elements are separated by cooling channels for the helium gas. The wheel rotates at 25 rpm synchronized with the beam pulses, so that consecutive pulses hit adjacent tungsten elements. An important design criterion is that the heat generated by radioactive decay after the beam has been switched off must not damage the target, even if all active cooling systems fail.

With the ESS beam parameters, every proton generates about 80 neutrons. Most of them are emitted with energies of millions of electron volts, while most experiments need cold neutrons, from room temperature down to some tens of kelvins. For this reason, the neutrons are slowed down in moderators containing water at room temperature and super-critical hydrogen at 13–20 K before being guided to the experimental stations, which are known as instruments. The construction budget contains 22 such instruments, including one devoted to fundamental physics with neutrons.
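
Combining the figures quoted here and earlier – a 5 MW beam of 2 GeV protons, and about 80 neutrons released per proton – gives a feel for the raw source strength. The following back-of-the-envelope sketch is an illustration only, not an official ESS performance number:

```python
# Rough source strength from the numbers quoted in the text.
E_CHARGE = 1.602e-19      # elementary charge, C

beam_power_w = 5e6        # 5 MW average beam power
proton_energy_ev = 2e9    # 2 GeV kinetic energy per proton
neutrons_per_proton = 80  # spallation yield quoted in the text

# P = I * (E/e), so the average current follows directly from power and energy
avg_current_a = beam_power_w / proton_energy_ev
protons_per_s = avg_current_a / E_CHARGE
neutrons_per_s = protons_per_s * neutrons_per_proton

print(f"average current    : {avg_current_a * 1e3:.1f} mA")  # ~2.5 mA
print(f"protons per second : {protons_per_s:.2e}")           # ~1.6e16
print(f"neutrons per second: {neutrons_per_s:.2e}")          # ~1.2e18
```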

The ESS is an international European collaboration where 17 European countries (Sweden, Denmark, Norway, Iceland, Estonia, Latvia, Lithuania, Poland, Germany, France, the UK, the Netherlands, the Czech Republic, Hungary, Switzerland, Italy and Spain) have signed letters of intent. Negotiations are now taking place to distribute the costs between these countries.

Sweden and Denmark have been hosting the ESS since the site decision, and a large fraction of the design study that started then was financed by the two countries. The project has now moved into the construction phase, with ground breaking planned for summer this year.

According to the current project plans, the accelerator up to and including the medium-beta section will be ready by the middle of 2019. Then, the first protons will be sent to the target and the first neutrons will reach the instruments. During the following few years, the high-beta cryomodules will be installed, such that the full 5 MW beam power will be reached in 2022.

The neutron instruments will be built in parallel. Around 40 concepts are being developed at different laboratories in Europe, and the 22 instruments of the complete ESS project will be chosen in a peer-reviewed selection process. Three of these will have been installed in time for the first neutrons. The rest will gradually come on line during the following years, so that all will have been installed by 2025.

The construction budget of ESS amounts to €1,843 million, half of which comes from Sweden, Denmark and Norway. The annual operating costs are estimated to be €140 million, and the cost for decommissioning the ESS after 40 years has been included in the budget. The hope, however, is that the scientific environment that will grow up around ESS and MAX IV – and within the Science Village Scandinavia to be located in the same area – will last longer than that.

A network for life https://cerncourier.com/a/a-network-for-life/ Wed, 30 Apr 2014 08:00:00 +0000

The Particle Training Network for European Radiotherapy (PARTNER) was established in 2008 to train young biologists, engineers, radio-oncologists and physicists in the various aspects of hadron therapy. This deceptively simple statement hides a vision that was truly innovative when the project started: to offer a multidisciplinary education in this cutting-edge discipline to train a future generation of experts who would be aware of the different scientific and technological challenges and move the field forward. PARTNER went on to provide research and training opportunities for 29 young scientists from a variety of backgrounds and countries, between 2008 and 2012. The publication of selected papers from PARTNER in the Journal of Radiation Research offers the opportunity to assess the research outcomes of the project.

As a Marie Curie Initial Training Network (ITN) within the European Union’s 7th Framework Programme (FP7), PARTNER was naturally focused on education, with a training programme encompassing science, technology and transferable skills (CERN Courier March 2010 p27). At the same time, the young scientists became engaged in research on a variety of topics from radiobiology to motion monitoring techniques, dosimetry, accelerators, computing and software tools. All of the research projects shared a focus on the impacts of clinical application, and many brought significant advances to the field.

Ingenious technologies

A key technology area is the development of affordable hadron-therapy installations. The next generation of accelerators should be smaller and less expensive. At the same time, they should allow fast, active energy modulation and have a high repetition rate, so that moving organs can be treated appropriately in a reasonable time. PARTNER contributed to the design for the CArbon BOoster for Therapy in Oncology (CABOTO) – a compact, efficient high-frequency linac to accelerate C⁶⁺ ions and H₂⁺ molecules from 150 to 410 MeV/u in about 24 m.

Gantries – the magnetic structures that bring particle beams onto the patient at the desired angle – are a major issue in the construction of carbon-ion facilities. The only existing carbon-ion gantry is installed at the Heidelberg Ion-Beam Therapy Center (HIT). It is a fixed, isocentric gantry 6.5 m tall and 25 m long, with a total weight of 600 tonnes. A design study supported by PARTNER and the FP7 project ULICE (CERN Courier December 2011 p37) proposed an innovative solution based on both the gantry and the treatment room being mobile. The isocentric gantry consists of a 90° bending dipole that rotates around the axis of the beam entrance, while the treatment room can move ±90°, thanks to an arrangement that keeps the floor of the room horizontal – like a cabin on a Ferris wheel (see figure). This design greatly reduces the weight and dimensions of the gantry, and hence the overall cost.

Clever solutions are also needed to ensure the correct positioning of the patient for treatment. This is particularly important in the case of tumours that change position as organs move – for example, when the patient breathes. A standard technique to reposition the patient accurately at each treatment session involves the implantation of radiographically visible fiducial markers. These markers must not introduce imaging artefacts or perturb the dose delivery process. In particle therapy, however, the interaction of the therapeutic beam with the markers can have a significant impact on the treatment. In this context, PARTNER conducted a study at the treatment set-up at HIT to compare a range of commercially available markers of different materials, shapes and sizes. Some of the markers offered promising results and will soon be used in clinical routine, but the study highlighted that markers should be chosen carefully, taking into account both the tumour localization and the irradiation strategy.

The combination of image guidance with a mask-immobilization system was also investigated at HIT on patients with head-and-neck, brain and skull-base tumours. The study demonstrated that, for the same immobilization device, different imaging verification protocols translate into important differences in accuracy.

At the National Centre for Oncological Treatment (CNAO) in Pavia, PARTNER researchers carried out a comparative analysis of in-room imaging versus an optical tracking system (OTS) for patient positioning. The results showed that while the OTS cannot replace the in-room imaging devices fully, the preliminary OTS correction can greatly support the refinement of the patient set-up based on images, and provide a secondary, independent verification system for patient positioning.

State-of-the-art techniques are also needed for treatment planning – the tool that allows medical physicists to translate the dose prescribed by the oncologists into the set-up parameters for the beam. A PARTNER research project developed a novel Monte Carlo treatment-planning tool for hadron therapy, suitable for treatments delivered with the pencil-beam scanning technique. The tool allows the set-up of single and multiple fields to be optimized for realistic conditions for patient treatment, and also allows dosimetric quality assurance to be performed. Another study led to an accurate parameterization of the lateral dose spread for scanned proton and carbon-ion beams, which is currently in clinical use at HIT and CNAO.
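
The dominant physics behind the lateral spread of a scanned beam is multiple Coulomb scattering. As an illustration of the kind of ingredient such parameterizations build on – not the PARTNER parameterization itself – the sketch below evaluates the classic Highland estimate of the RMS scattering angle for a proton crossing a slab of water; the radiation length of water is a standard tabulated value:

```python
# Highland estimate of the RMS (projected) multiple-scattering angle for a
# proton in water. This is a generic textbook formula, used here only to
# illustrate why the lateral dose spread grows with depth.
import math

M_P = 938.272        # proton rest energy, MeV
X0_WATER_CM = 36.1   # radiation length of water, cm (standard tabulated value)

def highland_theta0(kinetic_energy_mev, thickness_cm, z=1):
    """RMS projected scattering angle in radians (Highland parameterization)."""
    gamma = 1.0 + kinetic_energy_mev / M_P
    beta = math.sqrt(1.0 - 1.0 / gamma**2)
    p_momentum = gamma * beta * M_P                 # MeV/c
    x_over_x0 = thickness_cm / X0_WATER_CM
    return (13.6 / (beta * p_momentum)) * z * math.sqrt(x_over_x0) * (
        1.0 + 0.038 * math.log(x_over_x0))

if __name__ == "__main__":
    theta0 = highland_theta0(150.0, 5.0)  # 150 MeV proton after 5 cm of water
    print(f"theta0 ~ {theta0 * 1e3:.1f} mrad")  # ~17 mrad
```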

Set-up errors and organ motion can influence the dose distribution during a treatment session. To deal with these potential variations, additional margins are applied to the tumour target, forming the so-called planning target volume (PTV). This procedure ensures that the tumour is irradiated entirely, but inevitably increases the dose delivered to the surrounding healthy tissues. PARTNER researchers studied the generation of a patient-specific PTV from multiple images and were able to achieve satisfactory control of possible target variations, with no significant increase in the dose delivered to organs at risk.

The great attraction of hadron therapy is the possibility of a precisely tailored dose distribution, which allows tumour cells to be hit while sparing the healthy tissues. Sophisticated measurements are needed to verify the actual dose delivered in a specific beam set-up, and air-filled ionization chambers are extensively used in this context. The conversion of data from the ionization chambers into standard dosimetric quantities employs a quality factor that accounts for the specificity of the beam. The ratio of water-to-air stopping power is one of the main components of this quality factor and – in the case of carbon-ion beams – its biggest source of uncertainty. PARTNER researchers developed a fast computational method to determine this stopping-power ratio, with results that were in good agreement with full Monte Carlo calculations.
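
To see where this ratio comes from, the sketch below evaluates the plain Bethe formula – with no shell, density-effect or other corrections, so only an illustration of the principle rather than the PARTNER method – for water and for air, using commonly tabulated Z/A values and mean excitation energies. For protons at therapeutic energies it reproduces the familiar value of roughly 1.13:

```python
# Water-to-air mass stopping-power ratio from the uncorrected Bethe formula.
# Z/A and mean excitation energies I are standard tabulated values (assumed
# here): water Z/A = 0.5551, I ~ 75 eV; dry air Z/A = 0.4992, I ~ 85.7 eV.
import math

M_E_C2_EV = 510999.0  # electron rest energy, eV
M_P_MEV = 938.272     # proton rest energy, MeV

def mass_stopping_rel(kinetic_energy_mev, z_over_a, i_ev):
    """Mass stopping power up to a common constant (Bethe, no corrections)."""
    gamma = 1.0 + kinetic_energy_mev / M_P_MEV
    beta2 = 1.0 - 1.0 / gamma**2
    return z_over_a / beta2 * (
        math.log(2.0 * M_E_C2_EV * beta2 * gamma**2 / i_ev) - beta2)

def water_to_air_ratio(kinetic_energy_mev):
    return (mass_stopping_rel(kinetic_energy_mev, 0.5551, 75.0)
            / mass_stopping_rel(kinetic_energy_mev, 0.4992, 85.7))

if __name__ == "__main__":
    for t in (10, 50, 100, 200):
        print(f"{t:3d} MeV protons: s_w,air ~ {water_to_air_ratio(t):.3f}")
```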

Faster calculation methods are essential to re-compute the treatment plan quickly when needed, but they should not reduce the accuracy of the treatment planning. The PARTNER studies also demonstrated that a chamber-specific correction could be implemented in the treatment planning, bringing a small improvement to the overall accuracy of the verification of the plan.

Combining treatment modalities has become a standard approach in oncology, and it is important to understand how hadron therapy can fit into these combined treatment schemes. Within the PARTNER framework, three emerging treatment modalities were compared: volumetric-modulated arc therapy (VMAT), intensity-modulated proton beam therapy (IMPT) and intensity-modulated carbon-ion beam therapy (IMIT). Their combinations were also evaluated. The results clearly showed a better dose distribution in the case of combined treatments, but their actual clinical benefit remains to be demonstrated.

Biological factors

In the biological field, studies were performed to understand better the impact of hypoxia – oxygen deprivation – on cell survival, for various types of radiation therapy. Hypoxia is well known as one of the major reasons for the resistance of tumour cells to radiation. It also enhances the risk of metastatic formations. Understanding radioresistance is a key factor for more effective cancer therapy that will minimize local recurrences. Different levels of oxygen deprivation were studied, from intermediate hypoxia to total oxygen deprivation or anoxia. Cells irradiated under chronic anoxia turned out to be more sensitive to radiation than those under acute anoxia. Measurements also suggested that ions heavier than carbon could bring additional advantages in therapeutic irradiation, in particular for radioresistant hypoxic tumour regions.

The initial clinical experience at the CNAO facility provided the opportunity to study toxicity and quality of life for patients under the protocols approved by the Italian Health Ministry, namely for chordoma and chondrosarcoma. The preliminary results showed that all patients completed their treatment with no major toxicities and without interruptions, and that proton therapy did not affect their quality of life adversely. The assessment of quality of life in patients with these tumours is so far unique, as no other study of this kind has been published.

Side effects such as toxicity are an integral part of the information that determines the appropriate choice of treatment. Realistic, long-term data on such effects are difficult to obtain, mainly because of the limited duration of medical studies, so decision-making processes in medicine rely increasingly on modelling and simulation techniques. One of the PARTNER research projects focused on the implementation of a general Markov model for the analysis of side effects in radiotherapy, and developed a specific language to encode the medical understanding of a disease in computable definitions. The proposed method has the potential to automate the generation of Markov models from existing data and to be applicable to many similar decision problems.
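
As a toy illustration of the general idea – the PARTNER model and its disease-description language are considerably more sophisticated – a discrete-time Markov chain over a handful of patient states, with invented transition probabilities, can already project the long-term prevalence of a side effect:

```python
# A toy Markov model for a radiotherapy side effect. States and transition
# probabilities are hypothetical, chosen only to illustrate the mechanics.
import numpy as np

# States: 0 = no side effect, 1 = side effect present, 2 = resolved
# P[i][j] = probability of moving from state i to state j per follow-up year
P = np.array([
    [0.95, 0.05, 0.00],
    [0.00, 0.80, 0.20],
    [0.00, 0.02, 0.98],  # small hypothetical chance of recurrence
])

state = np.array([1.0, 0.0, 0.0])  # all patients start without the side effect
for year in range(1, 11):
    state = state @ P  # propagate the cohort one follow-up interval
    print(f"year {year:2d}: none {state[0]:.2f}  active {state[1]:.2f}  "
          f"resolved {state[2]:.2f}")
```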

Making optimal use of the available resources is a major challenge for the hadron-therapy community, with secure data sharing at the heart of the problem. The Hadron therapy Information Sharing Prototype (HISP) was developed within PARTNER to provide a gateway to patient information that is distributed across many hospital databases, and to support patient follow-up in multicentre clinical studies. HISP demonstrates a range of important features, and uses open-source software components that favour the platform’s sustainable extension and potential for adoption.

The PARTNER network made important contributions to key research areas connected to hadron therapy, geared towards the optimization of this option for cancer treatment. A unique multidisciplinary training portfolio allowed more than 90% of the PARTNER scientists to find positions soon after the end of the project, thanks also to the expertise acquired at the most advanced European hadron-therapy centres and to the networking opportunities provided by the ITN. The medical doctors from India and Singapore went back to their countries and hospitals, while most of the other researchers are now working in hadron-therapy facilities in Europe, the US and Japan. The specific goal of training experts for upcoming and operational facilities was therefore successfully met, and the researchers ensure that the network lives on, wherever they are in the world.

• The PARTNER project was funded by the European Commission within the FP7 People (Marie Curie) Programme, under Grant Agreement No 215840.

New results mark progress towards polarized ion beams in laser-induced acceleration https://cerncourier.com/a/new-results-mark-progress-towards-polarized-ion-beams-in-laser-induced-acceleration/ Fri, 28 Mar 2014 09:00:00 +0000

The field of laser-induced relativistic plasmas and, in particular, laser-driven particle acceleration, has undergone impressive progress in recent years. Despite many advances in understanding fundamental physical phenomena, one unexplored issue is how the particle spins are influenced by the huge magnetic fields inherently present in the plasmas.

Laser-induced generation of polarized-ion beams would without doubt be important in research at particle accelerators. In this context, ³He²⁺ ions have been discussed widely. They can serve as a substitute for polarized neutron beams, because in a ³He nucleus the two protons have opposite spin directions, so the spin of the nucleus is carried by the neutron. However, such beams are currently not available owing to a lack of corresponding ion sources. A promising approach for a laser-based ion source would be to use pre-polarized ³He gas as the target material. Polarization conservation of ³He ions in plasmas is also crucial for the feasibility of proposals aiming at an increase in efficiency of fusion reactors by using polarized fuel, because this efficiency depends strongly on the cross-section of the fusion reactions.

A group from Forschungszentrum Jülich (FZJ) and Heinrich-Heine University Düsseldorf has developed a method to measure the degree of polarization of laser-accelerated proton and ion beams. In a first experiment at the Arcturus Laser facility, protons of a few million electron volts – generated most easily by using thin foil targets – were used to measure the differential cross-section d²σ/dϑdφ of the Si(p,p′)Si reaction in a secondary scattering target. The result for the dependence on scattering angle is in excellent agreement with existing data, demonstrating the feasibility of a classical accelerator measurement with a laser-driven particle source.

The azimuthal-angle (φ) dependence of the scattering distributions allowed the degree of polarization of the laser-accelerated protons to be determined for the first time. As expected from computer simulations for the given target configuration, the data are consistent with an unpolarized beam. This “negative” result indicates that the particle spins are not affected by the strong magnetic fields and field gradients in the plasma. This is promising for future measurements using pre-polarized targets, which are underway at Arcturus.
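
The principle of such a measurement can be sketched compactly. For spin-half particles the yield of the secondary scattering varies with azimuthal angle as N(φ) ∝ N₀[1 + P·A_y·cos φ], where A_y is the analysing power of the reaction, so fitting the φ distribution yields the product P·A_y. The toy simulation below uses invented numbers and is not the FZJ analysis chain – for an unpolarized beam (P = 0) the fitted asymmetry comes out consistent with zero, as the group observed:

```python
# Toy polarimetry: generate azimuthal counts with the standard spin-1/2
# modulation N(phi) = N0 * (1 + P*Ay*cos(phi)) and fit the asymmetry back.
import numpy as np

rng = np.random.default_rng(1)
PHI = np.linspace(0.0, 2.0 * np.pi, 12, endpoint=False)  # 12 detector bins

def simulated_counts(n0, p_beam, a_y):
    """Poisson-fluctuated counts per bin for beam polarization p_beam."""
    return rng.poisson(n0 * (1.0 + p_beam * a_y * np.cos(PHI)))

def fitted_asymmetry(counts):
    """Least-squares fit of counts to N0*(1 + eps*cos(phi)); returns eps."""
    design = np.column_stack([np.ones_like(PHI), np.cos(PHI)])
    (n0, n1), *_ = np.linalg.lstsq(design, counts, rcond=None)
    return n1 / n0  # eps = P * Ay

if __name__ == "__main__":
    eps = fitted_asymmetry(simulated_counts(n0=5000, p_beam=0.0, a_y=0.3))
    print(f"fitted P*Ay = {eps:+.3f}  (consistent with zero for P = 0)")
```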

The polarization measurements are also an important step towards JuSPARC, the Jülich Short Pulse Particle and Radiation Centre at FZJ. This proposed laser facility will provide not only polarized beams but also intense X-ray and thermal neutron pulses to users from different fields of fundamental and applied research.

ICTR-PHE: uniting physics, medicine and biology https://cerncourier.com/a/ictr-phe-uniting-physics-medicine-and-biology/ Fri, 28 Mar 2014 09:00:00 +0000

Résumé

ICTR-PHE: uniting physics, medicine and biology

The ICTR-PHE 2014 conference brought together in Geneva some 400 participants from around the world to discuss the latest techniques in the fight against cancer. Researchers and practitioners from many disciplines reviewed the latest advances in translational research, through which innovations from basic research find applications in healthcare (physics, biology and clinical oncology). For all of the specialists involved, the ICTR-PHE conference is the ideal place to take stock of the work done so far and to define the next steps needed to keep the momentum high.

Physicists, biologists, physicians, chemists, nuclear-medicine experts, radio-oncologists, engineers and software developers – researchers and practitioners from many disciplines came to Geneva on 10–14 February for ICTR-PHE 2014, which brought together for the second time the International Conference on Translational Research in Radio-Oncology and Physics for Health in Europe. The joint conference aims to unite physics, biology and medicine for better healthcare, and the goal of this second meeting was to review the most recent advances in translational research – where developments in basic research are “translated” into means for improving health – in physics, biology and clinical oncology.

The conference featured the many advances that have occurred during the past two years since the first joint conference. The resolution and precision of medical imaging is continuing to grow with the use of combined modalities, such as positron-emission tomography (PET) with computed tomography (CT), and PET with magnetic resonance imaging (MRI) – an important technical breakthrough. Biologists and chemists are performing studies to develop new radiation carriers – including antibodies and nanoparticles – to target tumours. The Centro Nazionale di Adroterapia Oncologica (CNAO) in Italy has started hadron therapy with proton beams and carbon-ion beams, obtaining the necessary certification labels for both treatments. Another new centre, MedAustron in Austria, is being built and is reaching the commissioning phase. Moreover, while the use of proton therapy continues to grow around the world, the Japanese centres and the Heidelberg Ion-Beam Therapy Centre in Germany are using carbon-ion therapy on an increasing number of patients. For all of the experts involved in such a variety of different fields, the ICTR-PHE conference was the ideal place to take stock of the work done so far, and to define the next steps that the community should take to keep the momentum high.

Although the first patient was treated with protons 60 years ago in Berkeley, the field has not yet implemented all of the phases of the clinical trials required for evidence-based medicine and for acceptance by national health systems. In particular, several experts discussed the need to perform randomized trials. This, of course, comes with unavoidable ethical issues and methodological concerns. The community is geographically scattered and several important factors – such as the integrated dose that should be delivered, the fractionation and the types of tumours to be treated – are still being studied. On one hand, it is a hard task for the various scientists to define common protocols for the trials. On the other hand, physicians and patients might be sceptical towards new therapies that are not yet felt to have been tested extensively. Despite the fact that every year several thousand patients are diagnosed using radiopharmaceuticals and subsequently treated with hadron therapy, the use of particles is still often viewed with scepticism.

The situation is made even more complex by the fact that the fight against cancer is taking on a more personalized approach. Although highly beneficial to patients, this makes it difficult for doctors to apply the same treatment plan to a large number of people. Cancer is not really a single disease. Its many facets require different therapies for different patients, depending on the specific type of malignant cell, the location of the tumour, its dimensions, etc. Several presentations at the conference focused on the important impact that such personalized treatment has in the disease’s prognosis.

In this respect, the challenge for today’s oncologists starts with high-quality imaging that allows them to define the active tumour volume as well as the possible metastasis in the body. Again, depending on the type of tumour, researchers can now select the best radiopharmaceutical that, once injected into the body and in conjunction with a detection modality such as PET, is able to identify the target cells precisely. Moreover, the same carrier molecules that are able to bring the radiating isotopes to the malignant cells and make them visible to the detecting instruments could be used with more powerful isotopes, to bring a lethal dose into the tumour volume directly. Some of the most recent studies involve the use of specific peptides associated with isotopes obtained at particle accelerators. Others involve innovative nanoparticles as vehicles to bring radiation into the target. Each single solution implies the use of specific isotopes. At CERN, the MEDICIS project aims to produce isotopes for medical research. Although the project has only recently entered the construction phase, the collaboration between the MEDICIS team and specialized teams of radiobiologists and chemists has already begun.

Imaging has reached spatial resolutions down to 2 mm. The combination of various imaging techniques, such as PET/CT or PET/MRI, allows oncologists to gather information not only about the geometry of a tumour but also about its functionality. Further improvements could come from both better hardware and more sophisticated software and algorithms for integration of the information. Significant improvement to the hardware could be introduced by the time-of-flight technique – well known to particle physicists for its use in many high-energy experiments.
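
The benefit of time of flight is simple to quantify: the difference Δt between the arrival times of the two back-to-back 511 keV annihilation photons localizes the decay along the line of response to Δx = cΔt/2. The sketch below uses generic timing resolutions, not figures reported at the conference:

```python
# Localization along a PET line of response from coincidence timing.
C = 299_792_458.0  # speed of light, m/s

def tof_localization_mm(timing_resolution_ps):
    """Spatial uncertainty (mm) along the line of response: dx = c*dt/2."""
    return C * (timing_resolution_ps * 1e-12) / 2.0 * 1e3

if __name__ == "__main__":
    for dt_ps in (600, 400, 200):  # assumed scanner timing resolutions
        print(f"{dt_ps} ps timing -> ~{tof_localization_mm(dt_ps):.0f} mm along the LOR")
```

Constraining each event to a few centimetres of the line of response, rather than the whole line, improves the effective signal-to-noise of the reconstructed image.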

The best treatment

Once the oncologists have acquired the information about the malignant cells and tumour volume, as well as other important data about the patient, they can define the best treatment for a specific case. Computer simulations made with the GEANT4 and FLUKA software suites are used to define the most suitable treatment planning. These codes are in continuous development and are able to deliver increasingly precise information about the dose distribution. In addition to new advances in computer simulations, the ICTR-PHE conference also featured a presentation about the first 3D mapping over a known distance of the dose distribution along the whole path of a 62 MeV proton beam. These studies are extremely useful in the determination of collateral damage, including possible secondary tumours caused by particle beams.

Unwanted damage to healthy tissues is a key point when it comes to comparing conventional photon-radiation therapy with hadron therapy. Thanks to the intensity modulation and volumetric arc techniques, and image-guided treatments, today’s conventional radiation therapy has reached levels of effectiveness that challenge hadron therapy. Nevertheless, because of the specific way they deliver their energy (the well-known Bragg peak), hadrons can target tumours much more precisely. Therefore, hadron beams are potentially much less dangerous to nearby healthy tissues. However, their overall biological impact is still to be evaluated precisely, and the cost of the infrastructures is significantly higher than for the widely used conventional radiation therapy. The debate remains open, and a final word will only come once the various teams involved have carried out the necessary clinical trials. The importance of sharing information and data among all active partners was highlighted throughout the conference.

In general, the results presented at the conference were in many cases very promising. Not only has knowledge of cancer increased hugely in recent years, in particular at the molecular level, but also – and even more importantly – a different awareness is gaining momentum within the various communities. As one of the plenary speakers emphasized, the idea that a single oncologist can effectively fight cancer should be abandoned. Instead, collaboration among chemists, biologists, engineers, physicists and physicians should markedly improve the prognosis and the final outcome.

The beneficial impact of such collaboration was particularly evident when speakers presented results from the combination of various techniques, including surgery and chemotherapy. This is because several factors play a role in the response of malignant cells to radiation: drugs, of course, but also the patient’s immune status, the degree of hypoxia (oxygen deprivation) and the intrinsic nature of the tumour cells. Recent studies have shown, for example, that malignant cells infected by HPV respond better to radiation, which translates into a better prognosis.

The role played by hypoxia and the various ways to overcome it were popular topics. A particularly interesting talk emphasized the need to go a step further: having already acquired a deep knowledge of hypoxia in malignant tissues, clinicians should treat it with drugs before starting any further therapies. This is not yet part of current protocols, despite considerable supporting evidence from research studies.

Indeed, the time needed for a new medical advance to travel from the laboratory to the patient is a key issue. In this respect, the ICTR-PHE conference has a unique role. Medical doctors can learn about the latest radiopharmaceuticals, imaging instruments and therapies that other scientists have worked on. At the same time, physicists, specialized industry, radiobiologists and others can hear from the medical field where they should concentrate their efforts for future research.

The impression was that the community is very willing to build a new collaboration model and that CERN could play an important role. The newly created CERN Office for Medical Applications is a clear sign of the laboratory’s wish to contribute to the growth of the field. Medical doctors need cost-effective instruments that are easy to use and reliable over time. This presents a challenge for physicists, who will have to use the most advanced technologies to design new accelerator facilities to produce hadron beams for patient treatment.

In addition to new accelerators, there is a plethora of opportunities for the physics field. These include the construction of a biomedical facility at CERN to provide particle beams of different types and energies for external users for radiobiology and detector development; the construction and testing of innovative detectors for beam control and medical imaging; the development of state-of-the-art instruments for accurate dosimetry; the MEDICIS facility for the production of rare radioisotopes; and a powerful computing grid for image treatment and storage.

As one of the speakers said, quoting the novelist William Gibson: “The future is here. It is just not evenly distributed yet.” This is the next challenge for the community of scientists who attended ICTR-PHE 2014 – to take all of these advances to the patients as quickly as possible.

Physics highlights

Even though the conference focused on translational research and medical applications of physics, it would have been impossible to ignore the discovery by the ATLAS and CMS experiments at the LHC of a Higgs boson – the particle linked to a mechanism that gives mass to many fundamental particles – and the subsequent award of the 2013 Nobel Prize in Physics to two of the theoreticians who proposed the mechanism. Fabiola Gianotti, former spokesperson of the ATLAS experiment at the LHC, opened the conference and captivated the audience with the tale of the many years of Higgs hunting by thousands of researchers across the world.

The role of physics and physicists was also highlighted by Ugo Amaldi in his public talk “Physics is beautiful and useful”. The father of the word “hadrontherapy” showed how, following the discovery of X-rays in 1895, fundamental physics, particle therapy and diagnostics became three intertwined strands: advances in one field have an impact on the other two. Amaldi concluded his much-appreciated talk with an overview of possible future developments, including “Tulip” – a Turning Linac for Protontherapy – a new prototype that aims to deliver proton beams with compact, less-expensive instrumentation.

The post ICTR-PHE: uniting physics, medicine and biology appeared first on CERN Courier.

Advanced radiation detectors in industry https://cerncourier.com/a/advanced-radiation-detectors-in-industry/ Fri, 28 Mar 2014 09:00:00 +0000

The European Physical Society’s Technology and Innovation Group (EPS-TIG) was set up in 2011 to work at the boundary between basic and applied sciences, with annual workshops organized in collaboration with CERN as its main workhorse (CERN Courier April 2013 p31). The second workshop, organized in conjunction with the department of physics and astronomy and the “Fondazione Flaminia” of Bologna University, took place in Ravenna on 11–12 November 2013. The subject – advanced radiation detectors for industrial use – brought together experts involved in the research and development of advanced sensors and representatives from related spin-off companies.

The first session, on technology-transfer topics, opened with a keynote speech by Karsten Buse, director of the Fraunhofer Institute for Physical Measurement Technique (IPM), Freiburg. In the spirit of Joseph von Fraunhofer (1787–1826) – a researcher, inventor and entrepreneur – the Fraunhofer Gesellschaft promotes innovation and applied research that is of direct use for industry. Outlining the IPM’s mission and the specific competences and services it provides, Buse presented an impressive overview of technology projects that have been initiated and developed or improved and supported by the institute. He also emphasized the need to build up and secure intellectual property, and explained contract matters. The success stories include the MP3 audio-compression algorithm, white LEDs to replace conventional light bulbs, and all-solid-state widely tunable lasers. Buse concluded by observing that bridging the gap between academia and industry requires some attention, but is less difficult than often thought and also highly rewarding. A lively discussion followed in the audience of students, researchers and partners from industry.

The second talk focused on knowledge transfer (KT) from the perspective of CERN’s KT Group. First, Giovanni Anelli described the KT activities based on CERN’s technology portfolio and on people – that is, students and fellows. In the second part, Manjit Dosanjh presented the organization’s successful and continued transfer to medical applications of advanced technologies in the fields of accelerators, detectors and informatics technologies. Catalysing and facilitating collaborations between medical doctors, physicists and engineers, CERN plays an important role in “physics for health” projects at the European level via conferences and networks such as ENLIGHT, set up to bring medical doctors and physics researchers together (CERN Courier December 2012 p19).

Andrea Vacchi of INFN/Trieste reviewed the INFN’s KT activities. He emphasized that awareness of the value of the technology assets developed inside INFN is growing. In the past, technology transfer between INFN and industry happened mostly through the involvement of suppliers in the development of technologies. In future, INFN will take more proactive measures to encourage transfer between its research institutions and industry.

From lab to industry

The first afternoon was rounded off by Colin Latimer of the University of Belfast, a member of the EPS Executive Committee. He illustrated the varying timescales from invention to multi-billion-dollar mass markets with a number of example technologies, including optical fibres (1928), liquid-crystal displays (1936), magnetic-resonance imaging (MRI) scanners (1945) and lasers (1958), with high-temperature superconductors (1986) and graphene (2004) still waiting to make a major impact. Latimer went on to present results from the recent study commissioned by the EPS from the Centre for Economics and Business Research, which has shown the importance of physics to the European economy (EPS/Cebr 2013).

The second part of the workshop was devoted to sensors and innovation in instrumentation and industrial applications, starting with a series of talks that reviewed the latest developments. This was followed by presentations from industry on various sensor products, application markets and technological developments.

Erik Heijne, a pioneer of silicon and silicon-pixel detectors at CERN, started by discussing innovation in instrumentation through the use of microelectronics technology. Miniaturization to sub-micron silicon technologies allows many functions to be compacted into a small volume. This has led in turn to the integration of sensors and processing electronics in powerful devices, and has opened up new fields of application (CERN Courier March 2014 p26). In high-energy particle physics, the new experiments at the LHC are based on sophisticated chips that allow unprecedented event rates of up to 40 MHz. Some of the chips – or at least the underlying ideas – have found applications in materials analysis, medical imaging and other types of industrial equipment. The radiation-imaging matrix, for example, based on silicon-pixel and integrated read-out chips, already has many applications.

Detector applications

Julia Jungmann of PSI emphasized the use of active pixel detectors for imaging in mass spectrometry in molecular pathology, in research done at the FOM Institute AMOLF in Amsterdam. The devices have promising features for fast, sensitive ion imaging: time and space information from the same detector, high spatial resolution, direct image acquisition and highly parallel detection. The technique, which is based on the family of Medipix/Timepix devices, provides detailed information on molecular identity and localization – vital, for example, in detecting the molecular basis of a pathology without the need to label bio-molecules. Applications include disease studies, drug-distribution studies and forensics. The wish list is now for chips with 100 ps time bins, a 1 ms measurement interval, multi-hit capabilities at the pixel level, higher read-out rates and high fluence tolerance.

In a similar vein, Alberto Del Guerra of the University of Pisa presented the technique of positron-emission tomography (PET) and its applications. Outlining the physics and technology of PET, he showed improved variants of PET systems and applications to molecular imaging, which allow the visual representation, characterization and quantification of biological processes at the cellular and subcellular levels within living organisms. Clinical systems of hybrid PET and computerized tomography (CT) for application in oncology and neurology, as well as human PET and micro-PET equipment combined with small-animal CT, are available from industry, and today there are also systems in which PET and MRI are combined. Such systems are being used for monitoring purposes in hadron therapy in Italy, at the 62 MeV proton cyclotron of the CATANA facility in Catania and at the proton and carbon synchrotron of the CNAO centre in Pavia. An optimized tri-modality imaging tool for schizophrenia is even being developed, combining PET with MRI and electroencephalography measurements. Del Guerra’s take-home message was that technology transfer in the medical field needs long-term investment – industry can withdraw halfway if a technology is not profitable (as Siemens did in the case of proton therapy). In future, applications will be multimodal, with PET combined with other imaging techniques (CT, MRI, optical projection tomography) for specific organs such as the brain, breast and prostate.

The next topic concerned recent developments in the silicon drift detector (SDD) and its applications. Chiara Guazzoni, of the Politecnico di Milano and INFN Milan, gave an excellent overview of SDDs, which were invented by Emilio Gatti and Pavel Rehak 30 years ago. These detectors are now widely used in X-ray spectroscopy and are commercially available. Conventional and non-conventional applications include the non-destructive analysis of cultural heritage, biomedical imaging based on X-ray fluorescence, proton-induced X-ray emission studies, gamma-ray imaging and spectroscopy, and X-ray scatter imaging. As Gatti and Rehak stated in their first patent, “additional objects and advantages of the invention will become apparent to those skilled in the art”, and Guazzoni hopes that the art will keep “drifting on” towards new horizons.

Moving on to presentations from industry and start-up companies, Jürgen Knobloch of KETEK GmbH in Munich presented new high-throughput, large-area SDDs, starting with a historical review of the work of Josef Kemmer, who in 1970 started to develop planar silicon technology for semiconductor detectors. Collaborating with Rehak and the Max-Planck Institute in Munich, Kemmer went on to produce the first SDDs with a homogeneous entrance window, with depleted field-effect transistor (DEPFET) and MOS-type DEPFET (DEPMOS) technologies. In 1989 he founded the start-up company KETEK, which is now the global commercial market leader in SDD technology. Knobloch presented the range of products from KETEK and concluded with the company’s recommendations for better collaboration between science and industry: workshops of the kind organized by EPS-TIG; meetings between scientists and technology companies to set out practical needs and future requirements; involvement of technology-transfer offices to resolve intellectual-property issues; encouragement of industry to accept longer times for returns on investment; and the strengthening of synergies between basic research and industry R&D.

Knobloch’s colleague at KETEK, Werner Hartinger, then described new silicon photomultipliers (SiPMs) with high photon-detection efficiency, and listed the characteristics of a series of KETEK’s SiPM sensors, which also feature a huge gain (> 10⁶) with low excess noise and a low temperature coefficient. KETEK has off-the-shelf SiPM devices and also customizes devices for CERN. The next steps will be continuous noise reduction (in both dark rate and cross-talk) by enhancing the KETEK “trench” technology, improvement of the pulse shape and timing properties by optimizing parasitic elements and read-out, and the production of chip-size packages and arrays at the package level.

New start-ups

PIXIRAD, a new X-ray imaging system based on chromatic photon-counting technology, was presented by Ronaldo Bellazzini of PIXIRAD Imaging Counters srl – a recently constituted INFN spin-off company. The detector can deliver extremely clear and highly detailed X-ray images for medical, biological, industrial and scientific applications in the energy range 1–100 keV. Photon counting, colour mode and high spatial resolution lead to an optimal ratio of image quality to absorbed dose. Modules of 1, 2, 4 and 8 tiles have been built with almost zero dead space between the blocks. A complete X-ray camera based on the PIXIRAD-1 single-module assembly is available to customers in scientific and industrial markets for X-ray diffraction, micro-CT and similar applications. A dedicated machine to perform X-ray slot-scanning imaging has been designed and built, and is currently under test. This system, which uses the PIXIRAD-8 module and is able to produce large-area images with fine position resolution, has been designed for digital mammography – one of the most demanding X-ray imaging applications.

CIVIDEC Instrumentation – another start-up company – was founded in 2009 by Erich Griesmayer. He presented several examples of applications of its products, which are based on diamond-detector technology. They have found use at the LHC and other accelerator beamlines as beam-loss and beam-position monitors, for time measurements, high-radiation-level measurements and neutron time-of-flight, and as low-temperature detectors in superconducting quadrupoles. The company provides turn-key solutions that connect via the internet, supplying clients worldwide.

Nicola Tartoni, head of the detector group at the Diamond Light Source, outlined the layout of the facility and its diversified programmes. He presented an overview of the detector development and beamlines of this outstanding user facility, which works in partnership with industry on diverse R&D projects of increasing complexity.

Last, Carlos Granja, of the Institute of Experimental and Applied Physics (IEAP) at the Czech Technical University (CTU) in Prague, described research carried out with the European Space Agency (ESA) that demonstrates impressive progress in the detection and tracking of individual radiation quanta in space. This work uses the Timepix hybrid semiconductor pixel-detector developed by the Medipix collaboration at CERN. The Timepix-based space-qualified payload, produced by IEAP CTU in collaboration with the CSRC company of the Czech Republic, has been operating continuously on board ESA’s Proba-V satellite in low-Earth orbit at 820 km altitude since its launch in May 2013. Highly miniaturized devices produced by IEAP CTU are also flying on board the International Space Station for the University of Houston and NASA, providing high-sensitivity quantum dosimetry for the space-station crew.

In other work, IEAP CTU has developed a micro-tracker particle telescope in which particle tracking and directional sensitivity are enhanced by stacked layers of the Timepix device. For improved, wide-application radiation imaging, edgeless Timepix sensors developed at VTT and Advacam in Finland – with advanced read-out instrumentation and micrometre-precision tiling technology (available at IEAP CTU and the WIDEPIX spin-off company of the Czech Republic) – enable large sensitive areas of up to 14 cm square to be covered by up to 100 Timepix sensors. This development extends high-resolution X-ray and neutron imaging at the micrometre level to a range of scientific and industrial applications.
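
The tiling arithmetic is easy to verify from the standard published Timepix geometry (a 256 × 256 pixel matrix at 55 μm pitch – figures from the Medipix collaboration, not from this article); a minimal sketch:

```python
# Quick check of the tiling arithmetic quoted above, using the standard
# published Timepix geometry: a 256 x 256 pixel matrix at 55 um pitch.
PITCH_MM = 0.055
N_PIXELS = 256

edge_mm = PITCH_MM * N_PIXELS        # one edgeless sensor is ~14.1 mm wide
tiled_edge_mm = 10 * edge_mm         # a 10 x 10 array of 100 sensors

print(f"single sensor edge: {edge_mm:.2f} mm")
print(f"100-sensor array:   {tiled_edge_mm / 10:.1f} cm square")
```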

• For more about the workshop, visit www.emrg.it/TIG_Workshop_2013/program.php?language=en. For the presentations, see http://indico.cern.ch/event/284070/.

The post Advanced radiation detectors in industry appeared first on CERN Courier.

ClearPEM clarifies breast cancer diagnosis https://cerncourier.com/a/clearpem-clarifies-breast-cancer-diagnosis/ Fri, 19 Jul 2013 07:00:00 +0000 Knowledge gained in developing particle detectors for the LHC has been used to create a dedicated PET device for breast scans.

Figure: ClearPEM-Sonic

Breast cancer is the most frequent type of cancer among women and accounts for up to 23% of all cancer cases in female patients. The chance of a full recovery is high if the cancer is detected while it is still sufficiently small and has not had time to spread to other parts of the body. Routine breast-cancer screening is therefore part of health-care policy in many advanced countries. Conventional imaging techniques, such as X-ray, ultrasound or magnetic resonance imaging (MRI), rely on anatomical differences between healthy and cancerous tissue. For most patients, the information provided by these different modalities is sufficient to establish a clear diagnosis. For some, however, the examination will be inconclusive – for example, because their breast tissue is too dense to allow for a clear image – and further examinations will be required. Others may be diagnosed with a suspicious lesion that requires a biopsy for confirmation, yet once the biopsy is over, it might turn out to have been a false alarm.

Patients in this latter category can benefit from nuclear medicine. Positron-emission tomography (PET), for example, offers an entirely different approach to medical imaging by focusing on differences in the body’s metabolism. PET uses molecules involved in metabolic processes, labelled with a positron-emitting radioisotope. The molecule, once injected, is taken up in different proportions by healthy and cancerous cells. The emitted positrons annihilate with electrons in the surrounding atoms and produce a back-to-back pair of 511 keV γ rays. This radiation is detected to reveal the distribution of the isotope in the patient’s body. However, whole-body PET suffers from a low spatial resolution – 5–10 mm for most machines – which is too coarse to allow for a precise breast examination. Several research groups are therefore aiming to produce dedicated systems, known as positron-emission mammographs (PEM), with a resolution better than 2 mm.

One of these groups is the Crystal Clear collaboration (CCC), which is developing a system called ClearPEM. Founded in 1990 as project RD-18 within the programme of CERN’s Detector Research and Development Committee, the CCC aimed at R&D on fast, radiation-hard scintillating crystals for calorimetry at the LHC (Lecoq 1991). In this context, the collaboration contributed to the successful development of the lead-tungstate (PbWO₄) crystals now used in the electromagnetic calorimeters of the CMS and ALICE experiments at the LHC (Breskin and Voss 2009).

Building on this experience, the CCC has transferred its knowledge to medical applications – initially through the development of a preclinical scanner for small animals, the ClearPET (Auffray et al. 2004). Indeed, the technical requirements for PET are close to those of applications in high-energy physics. Both require fast scintillators with high light output and good energy resolution, as well as compact, efficient photodetectors read out by highly integrated, low-noise electronics that can process signals from thousands of channels. The CCC also has expertise in co-ordinating an international collaboration to develop leading-edge scientific devices.

Recently, the collaboration has used the experience gained with ClearPET to develop a dedicated PET system for human medicine – the ClearPEM, shown in figure 1 (Lecoq and Varela 2002). The breast was chosen as the target organ because of the benefits related to precise diagnosis of breast cancer. With the ClearPEM, the patient lies in a prone position on a bed designed so that the breast hangs through a hole. A robot moves the bed into position over two parallel detector plates that rotate around the breast to acquire a full 3D image. ClearPEM can also examine the armpit – the axilla – by rotating its detector arm through 90 degrees so that the plates sit on either side of it.

Figure: ClearPEM crystal matrices

Each detector plate contains 96 detector matrices, where one matrix consists of an 8 × 4 array of cerium-doped lutetium-yttrium silicate (LYSO:Ce) crystals, each 2 × 2 × 20 mm³ in size. As figure 2 shows, each crystal matrix is coupled to two 8 × 4 Hamamatsu S8550 avalanche-photodiode (APD) arrays, such that every 2 × 2 mm² read-out face is coupled to a dedicated APD. This configuration allows the depth of interaction (DOI) in the crystals to be measured, reducing the parallax error of the lines of response and contributing to better spatial resolution in the reconstructed image. The DOI can be measured with an uncertainty of around 2 mm on the exact position of the γ interaction in the crystal. Each signal channel is coupled to one input of a dedicated 192-channel ASIC, developed by the Portuguese Laboratory for Particle Physics and Instrumentation (LIP). It provides front-end treatment of the signal before handing it over to a 10-bit sampling ADC for digitization (Varela et al. 2007). The image is reconstructed with a dedicated iterative algorithm.
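
The article does not give ClearPEM’s DOI estimator, but with crystals read out at both ends a standard approach is to map the light-sharing asymmetry between the two photodetectors onto a depth. A minimal sketch under that assumption (the linear model and unit slope are illustrative, not the detector’s actual calibration):

```python
# Sketch of a standard depth-of-interaction estimator for a crystal read
# out at both ends, as in ClearPEM's dual-APD configuration. The linear
# light-sharing model and unit slope are assumptions for illustration,
# not the detector's actual calibration.
CRYSTAL_LENGTH_MM = 20.0  # LYSO:Ce crystal length quoted in the article

def doi_mm(amp_a: float, amp_b: float, slope: float = 1.0) -> float:
    """Depth of the gamma interaction, measured from the end read by APD B.

    The asymmetry (A - B) / (A + B) runs from -1 to +1 as the interaction
    point moves from the B end to the A end of the crystal (linear model).
    """
    asym = (amp_a - amp_b) / (amp_a + amp_b)
    return 0.5 * CRYSTAL_LENGTH_MM * (1.0 + slope * asym)

# An event seen more strongly by APD A reconstructs nearer the A end:
print(f"DOI ~ {doi_mm(120.0, 80.0):.1f} mm from the B end of the crystal")
```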

Two ClearPEM prototypes have been built. The first is currently installed at the Instituto de Ciências Nucleares Aplicadas à Saúde in Coimbra, Portugal. The second, installed at Hôpital Nord in Marseilles, France, is used for ClearPEM-Sonic, a project within the European Centre for Research in Medical Imaging (CERIMED) initiative. While ClearPEM provides high-resolution metabolic information, it lacks anatomical details. ClearPEM-Sonic therefore extends the second prototype with an ultrasound elastography device, which images strain in soft tissue (Frisch 2011). The aim is to provide multimodal information that reveals the exact location of potential lesions in the surrounding anatomy. The elastographic information further improves the specificity of the examination by identifying non-cancerous conditions – such as benign inflammatory diseases of the breast – that exhibit increased uptake of ¹⁸F-fluorodeoxyglucose (FDG), the radioactive tracer used in PET imaging.

Both prototypes have been tested extensively. The electronic noise level is under 2%, with an interchannel noise dispersion below 8%. The front-end trigger accepts signals at a rate of 2.5 MHz, while the overall acquisition rate reaches 0.8 MHz. The detector has been properly calibrated and gives an energy resolution of 14.6% FWHM for 511 keV photons, which allows for efficient rejection of photons that have lost energy in a scattering process. The coincidence-time resolution of 4.6 ns FWHM reduces the number of random coincidences. The global detection efficiency in the centre of the plates has been determined to be 1.5% at a plate distance of 100 mm. The image resolution measured with a dedicated Jaszczak phantom is 1.3 mm.
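
These figures can be turned into back-of-envelope operating numbers with the standard PET estimates: an energy window around the 511 keV photopeak for scatter rejection, and a randoms rate of 2τS₁S₂ for a coincidence window τ and per-detector singles rates S₁, S₂. A minimal sketch using the quoted resolutions (the window width and singles rates below are assumptions for illustration):

```python
# Back-of-envelope numbers from the quoted ClearPEM resolutions, using
# standard PET estimates. Window width and singles rates are assumptions.
FWHM_TO_SIGMA = 1.0 / 2.3548

# Scatter rejection: a +/- 1.5 sigma energy window around the 511 keV peak.
sigma_e_kev = 0.146 * 511.0 * FWHM_TO_SIGMA   # from 14.6% FWHM at 511 keV
lo, hi = 511.0 - 1.5 * sigma_e_kev, 511.0 + 1.5 * sigma_e_kev
print(f"energy window ~ {lo:.0f}-{hi:.0f} keV")

# Random coincidences: R = 2 * tau * S1 * S2 for coincidence window tau
# and per-detector singles rates S1, S2 (100 kHz each, invented here).
tau_s = 4.6e-9                 # take the 4.6 ns FWHM as the window width
s1 = s2 = 1.0e5
print(f"randoms rate ~ {2.0 * tau_s * s1 * s2:.0f} per second")
```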

The competent French authority has approved ClearPEM-Sonic for a first clinical trial on 20 patients. The goal of this trial is to study the feasibility and safety of PEM examinations. In parallel, the results of ClearPEM are being compared with other modalities, such as classical B-mode ultrasound, X-ray mammography, whole-body combined PET and computerized tomography (PET/CT) imaging and MRI, which all patients participating in this trial will have undergone. The ClearPEM image is acquired immediately after the whole-body PET/CT, which avoids the need for a second injection of FDG for the patient. The histological assessment of the biopsy is used as the gold standard.

The sample case study shown in figure 3 concerns a patient who was diagnosed with multifocal breast cancer during the initial examination. The whole-body PET/CT reveals a first lesion in the left breast and a second close to the axilla. Before deciding on the best therapy, it was crucial to find out whether the cancer had spread through the whole breast or was still confined to two individual lesions. An extended examination with MRI shows small lesions around the first one; the whole-body PET/CT image, however, does not show them. The standard procedure is to obtain biopsy samples of the suspicious tissue, but a high-resolution PET can give the same information. Indeed, when the patient was imaged with ClearPEM, the lesions visible with MRI were confirmed to be metabolically hyperactive, i.e. potentially cancerous. The subsequent biopsy confirmed this indication. This clinical case study, together with several others, hints at how ClearPEM could improve the diagnostic process.

This project successfully demonstrates the value of fundamental research in high-energy physics in applications of benefit to wider society. The knowledge gained by an international collaboration in the development of particle detectors for the LHC has been put to use in the construction of a new medical device – a dedicated breast-PET scanner, ClearPEM. It provides excellent image resolution that allows the detection of small lesions, and its high detection efficiency allows a reduction in the total examination time and in the amount of radioactive tracer that has to be injected. Finally, the first clinical results hint at the medical value of this device.

• The members of the ClearPEM-Sonic collaboration are: CERN; the University of Aix-Marseille; the Vrije Universiteit Brussel; the Portuguese Laboratory for Particle Physics and Instrumentation, Lisbon; the Laboratoire de Mécanique et d’Acoustique, Marseille; the University of Milano-Bicocca; PETsys, Lisbon; SuperSonic Imagine, Aix-en-Provence; Assistance Publique – Hôpitaux de Marseille; and the Institut Paoli-Calmettes, Marseille.

The post ClearPEM clarifies breast cancer diagnosis appeared first on CERN Courier.
