Culture and history Archives – CERN Courier
https://cerncourier.com/c/culture-history/ – Reporting on international high-energy physics

Feature | Four ways to interpret quantum mechanics
https://cerncourier.com/a/four-ways-to-interpret-quantum-mechanics/ (9 July 2025)
Carlo Rovelli describes the major schools of thought on how to make sense of a purely quantum world.

One hundred years after its birth, quantum mechanics is the foundation of our understanding of the physical world. Yet debates on how to interpret the theory – especially the thorny question of what happens when we make a measurement – remain as lively today as during the 1930s.

The latest recognition of the fertility of studying the interpretation of quantum mechanics was the award of the 2022 Nobel Prize in Physics to Alain Aspect, John Clauser and Anton Zeilinger. The motivation for the prize pointed out that the bubbling field of quantum information, with its numerous current and potential technological applications, largely stems from the work of John Bell at CERN in the 1960s and 1970s, which in turn was motivated by the debate on the interpretation of quantum mechanics.

The majority of scientists use a textbook formulation of the theory that distinguishes the quantum system being studied from “the rest of the world” – including the measuring apparatus and the experimenter, all described in classical terms. Used in this orthodox manner, quantum theory describes how quantum systems react when probed by the rest of the world. It works flawlessly.

Sense and sensibility

The problem is that the rest of the world is quantum mechanical as well. There are of course regimes in which the behaviour of a quantum system is well approximated by classical mechanics. One may even be tempted to think that this suffices to solve the difficulty. But this leaves us in the awkward position of having a general theory of the world that only makes sense under special approximate conditions. Can we make sense of the theory in general?

Today, variants of four main ideas stand at the forefront of efforts to make quantum mechanics more conceptually robust. They are known as physical collapse, hidden variables, many worlds and relational quantum mechanics. Each appears to me to be viable a priori, but each comes with a conceptual price to pay. The latter two may be of particular interest to the high-energy community as the first two do not appear to fit well with relativity.

Probing physical collapse

The idea of physical collapse is simple: we are missing a piece of the dynamics. There may exist a yet-undiscovered physical interaction that causes the wavefunction to “collapse” when the quantum system interacts with the classical world in a measurement. The idea is empirically testable. So far, all laboratory attempts to find violations of the textbook Schrödinger equation have failed (see “Probing physical collapse” figure), and some models for these hypothetical new dynamics have been ruled out by measurements.
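
To make this slightly more concrete (an illustrative aside using the standard Ghirardi–Rimini–Weber parameterisation, not something specific to Rovelli's text), the best-known collapse models supplement the Schrödinger equation with spontaneous localisations: each constituent particle is localised to a width $r_C$ at a random mean rate $\lambda$, with the originally proposed values

$$\lambda \approx 10^{-16}\ \mathrm{s^{-1}}, \qquad r_C \approx 10^{-7}\ \mathrm{m},$$

so that a single particle is essentially never affected, while the centre of mass of a solid containing $N \sim 10^{23}$ correlated particles collapses at an effective rate $N\lambda$, within microseconds. Laboratory searches of the kind shown in the figure constrain the allowed region of the $(\lambda, r_C)$ parameter plane.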

The second possibility, hidden variables, follows on from Einstein’s belief that quantum mechanics is incomplete. It posits that the theory’s predictions are exactly correct, but that there are additional variables describing what is going on, besides those in the usual formulation: the reason why quantum predictions are probabilistic is our ignorance of these other variables.

The work of John Bell shows that the dynamics of any such theory will have some degree of non-locality (see “Non-locality” image). In the non-relativistic domain, there is a good example of such a theory, which goes under the name of de Broglie–Bohm, or pilot-wave, theory. It has non-local but deterministic dynamics capable of reproducing the predictions of non-relativistic quantum-particle dynamics. As far as I am aware, all existing theories of this kind break Lorentz invariance, and the extension of hidden-variable theories to quantum-field-theoretical domains appears cumbersome.
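
For readers who want the statement in formulas (a standard textbook rendering, not part of Rovelli's argument here), Bell's result is often quoted in the CHSH form: for measurement settings $a, a'$ on one particle and $b, b'$ on its distant partner, any local hidden-variable model obeys

$$\left| E(a,b) + E(a,b') + E(a',b) - E(a',b') \right| \le 2,$$

where $E$ denotes the correlation of the two outcomes, whereas quantum mechanics predicts values up to $2\sqrt{2}$ for suitably chosen settings on an entangled pair. It is precisely this violation that the experiments of Clauser, Aspect and Zeilinger confirmed.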

Relativistic interpretations

Let me now come to the two ideas that are naturally closer to relativistic physics. The first is the many-worlds interpretation – a way of making sense of quantum theory without either changing its dynamics or adding extra variables. It is described in detail in this edition of CERN Courier by one of its leading contemporary proponents (see “The minimalism of many worlds“), but the main idea is the following: being a genuine quantum system, the apparatus that makes a quantum measurement does not collapse the superposition of possible measurement outcomes – it becomes a quantum superposition of the possibilities, as does any human observer.

Non-locality

If we observe a singular outcome, says the many-worlds interpretation, it is not because one of the probabilistic alternatives has actualised in a mysterious “quantum measurement”. Rather, it is because we have split into a quantum superposition of ourselves, and we just happen to be in one of the resulting copies. The world we see around us is thus only one of the branches of a forest of parallel worlds in the overall quantum state of everything. The price to pay to make sense of quantum theory in this manner is to accept the idea that the reality we see is just a branch in a vast collection of possible worlds that include innumerable copies of ourselves.

Relational interpretations are the most recent of the four kinds mentioned. They similarly avoid physical collapse or hidden variables, but do so without multiplying worlds. They stay closer to the orthodox textbook interpretation, but with no privileged status for observers. The idea is to think of quantum theory in a manner closer to the way it was initially conceived by Born, Jordan, Heisenberg and Dirac: namely in terms of transition amplitudes between observations rather than quantum states evolving continuously in time, as emphasised by Schrödinger’s wave mechanics (see “A matter of taste” image).

Observer relativity

The alternative to taking the quantum state as the fundamental entity of the theory is to focus on the information that an arbitrary system can have about another arbitrary system. This information is embodied in the physics of the apparatus: the position of its pointer variable, the trace in a bubble chamber, a person’s memory or a scientist’s logbook. After a measurement, these physical quantities “have information” about the measured system because their values are correlated with properties of the observed system.

Quantum theory can be interpreted as describing the relative information that systems can have about one another. The quantum state is interpreted as a way of coding the information about a system available to another system. What looks like a multiplicity of worlds in the many-worlds interpretation becomes nothing more than a mathematical accounting of possibilities and probabilities.

A matter of taste

The relational interpretation reduces the content of the physical theory to be about how systems affect other systems. This is like the orthodox textbook interpretation, but made democratic. Instead of a preferred classical world, any system can play a role that is a generalisation of the Copenhagen observer. Relativity teaches us that velocity is a relative concept: an object has no velocity by itself, but only relative to another object. Similarly, quantum mechanics, interpreted in this manner, teaches us that all physical variables are relative. They are not properties of a single object, but ways in which an object affects another object.

The QBism version of the interpretation restricts its attention to observing systems that are rational agents: they can use observations and make probabilistic predictions about the future. Probability is interpreted subjectively, as the expectation of a rational agent. The relational interpretation proper does not accept this restriction: it considers the information that any system can have about any other system. Here, “information” is understood in the simple physical sense of correlation described above.

Like many worlds – to which it is not unrelated – the relational interpretation does not add new dynamics or new variables. Unlike many worlds, it does not ask us to think about parallel worlds either. The conceptual price to pay is a radical weakening of a strong form of realism: the theory does not give us a picture of a unique objective sequence of facts, but only perspectives on the reality of physical systems, and how these perspectives interact with one another. Only quantum states of a system relative to another system play a role in this interpretation. The many-worlds interpretation is very close to this. It supplements the relational interpretation with an overall quantum state, interpreted realistically, achieving a stronger version of realism at the price of multiplying worlds. In this sense, the many worlds and relational interpretations can be seen as two sides of the same coin.

Every theoretical physicist who is any good knows six or seven different theoretical representations for exactly the same physics

I have only sketched here the most discussed alternatives, and have tried to be as neutral as possible in a field of lively debates in which I have my own strong bias (towards the fourth solution). As I have mentioned, empirical tests can only probe the physical-collapse hypothesis.

There is nothing wrong, in science, in using different pictures for the same phenomenon. Conceptual flexibility is itself a resource. Specific interpretations often turn out to be well adapted to specific problems. In quantum optics it is sometimes convenient to think that there is a wave undergoing interference, as well as a particle that follows a single trajectory guided by the wave, as in the pilot-wave hidden-variable theory. In quantum computing, it is convenient to think that different calculations are being performed in parallel in different worlds. My own field of loop quantum gravity treats spacetime regions as quantum processes: here, the relational interpretation merges very naturally with general relativity, because spacetime regions themselves become quantum processes, affecting each other.

Richard Feynman famously wrote that “every theoretical physicist who is any good knows six or seven different theoretical representations for exactly the same physics. He knows that they are all equivalent, and that nobody is ever going to be able to decide which one is right at that level, but he keeps them in his head, hoping that they will give him different ideas for guessing.” I think that this is where we are, in trying to make sense of our best physical theory. We have various ways to make sense of it. We do not yet know which of these will turn out to be the most fruitful in the future.

Meeting report | Quantum theory returns to Helgoland
https://cerncourier.com/a/quantum-theory-returns-to-helgoland/ (8 July 2025)
The takeaway from Helgoland 2025 was that the foundations of quantum mechanics, though strongly built on Helgoland 100 years ago, remain open to interpretation.

In June 1925, Werner Heisenberg retreated to the German island of Helgoland seeking relief from hay fever and the conceptual disarray of the old quantum theory. On this remote, rocky outpost in the North Sea, he laid the foundations of matrix mechanics. Later, his “island epiphany” would pass through the hands of Max Born, Wolfgang Pauli, Pascual Jordan and several others, and become the first mature formulation of quantum theory. From 9 to 14 June 2025, almost a century later, hundreds of researchers gathered on Helgoland to mark the anniversary – and to deal with pressing and unfinished business.

Alfred D Stone (Yale University) called upon participants to challenge the folklore surrounding quantum theory’s birth. Philosopher Elise Crull (City College of New York) drew overdue attention to Grete Hermann, who hinted at entanglement before it had a name and anticipated Bell in identifying a flaw in von Neumann’s no-go theorem, which had been taken as proof that hidden-variable theories are impossible. Science writer Philip Ball questioned Heisenberg’s epiphany itself: he didn’t invent matrix mechanics in a flash, claims Ball, nor immediately grasp its relevance, and it took months, and others, to see his contribution for what it was (see “Lend me your ears” image).

Building on a strong base

A clear takeaway from Helgoland 2025 was that the foundations of quantum mechanics, though strongly built on Helgoland 100 years ago, nevertheless remain open to interpretation, and any future progress will depend on excavating them directly (see “Four ways to interpret quantum mechanics“).

Does the quantum wavefunction represent an objective element of reality or merely an observer’s state of knowledge? On this question, Helgoland 2025 could scarcely have been more diverse. Christopher Fuchs (UMass Boston) passionately defended quantum Bayesianism, which recasts the Born probability rule as a consistency condition for rational agents updating their beliefs. Wojciech Zurek (Los Alamos National Laboratory) presented the quantum-Darwinist perspective, in which classical objectivity emerges from redundant quantum information encoded across the environment. Although Zurek himself maintains a more agnostic stance, his decoherence-based framework is now widely embraced by proponents of many-worlds quantum mechanics (see “The minimalism of many worlds“).

The foundations of quantum mechanics remain open to interpretation, and any future progress will depend on excavating them directly

Markus Aspelmeyer (University of Vienna) made the case that a signature of gravity’s long-speculated quantum nature may soon be within experimental reach. Building on the “gravitational Schrödinger’s cat” thought experiment proposed by Feynman in the 1950s, he described how placing a massive object in a spatial superposition could entangle a nearby test mass through their gravitational interaction. Such a scenario would produce correlations that are inexplicable by classical general relativity alone, offering direct empirical evidence that gravity must be described quantum-mechanically. Realising this type of experiment requires ultra-low pressures and cryogenic temperatures to suppress decoherence, alongside extremely low-noise measurements of gravitational effects at short distances. Recent advances in optical and optomechanical techniques for levitating and controlling nanoparticles suggest a path forward – one that could bring evidence for quantum gravity not from black holes or the early universe, but from laboratories on Earth.
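
As a rough orientation (a schematic estimate, not a figure quoted in the talk), the size of the effect can be gauged from the Newtonian phase accumulated between two masses $m_1$ and $m_2$ held a distance $d$ apart for a time $t$,

$$\phi \sim \frac{G\, m_1 m_2\, t}{\hbar\, d}.$$

If each mass is placed in a superposition of positions, the different branches acquire different phases of this type, and entanglement between the two masses becomes detectable once the phase differences approach order unity. This is what drives the push towards heavier levitated particles, longer coherence times and shorter separations.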

Information insights

Quantum information was never far from the conversation. Isaac Chuang (MIT) offered a reconstruction of how Heisenberg might have arrived at the principles of quantum information, had his inspiration come from Shannon’s Mathematical Theory of Communication. He recast his original insights into three broad principles: observations act on systems; local and global perspectives are in tension; and the order of measurements matters. Starting from these ingredients, one could in principle recover the structure of the qubit and the foundations of quantum computation. Taking the analogy one step further, he suggested that similar tensions between memorisation and generalisation – or robustness and adaptability – may one day give rise to a quantum theory of learning.
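
A minimal illustration of the third principle (in standard qubit notation, not Chuang's own slides): observables measured along different axes of a single qubit do not commute,

$$\sigma_x = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \qquad \sigma_z = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}, \qquad \sigma_x\sigma_z - \sigma_z\sigma_x = -2i\,\sigma_y \neq 0,$$

so measuring $\sigma_x$ and then $\sigma_z$ is not the same experiment as measuring them in the opposite order; it is this non-commutativity, combined with the first two principles, that the reconstruction of the qubit builds on.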

Helgoland 2025 illustrated just how much quantum mechanics has diversified since its early days. No longer just a framework for explaining atomic spectra, the photoelectric effect and black-body radiation, it is at once a formalism describing high-energy particle scattering, a handbook for controlling the most exotic states of matter, the foundation for information technologies now driving national investment plans, and a source of philosophical conundrums that, after decades at the margins, has once again taken centre stage in theoretical physics.

Opinion | Quantum culture
https://cerncourier.com/a/quantum-culture/ (8 July 2025)
Kanta Dihal explores why quantum mechanics captures the imagination of writers – and how ‘quantum culture’ affects the public understanding of science.

Kanta Dihal

How has quantum mechanics influenced culture in the last 100 years?

Quantum physics offers an opportunity to make the impossible seem plausible. For instance, if your superhero dies dramatically but the actor is still on the payroll, you have a few options available. You could pretend the hero miraculously survived the calamity of the previous instalment. You could also pretend the events of the previous instalment never happened. And then there is Star Wars: “Somehow, Palpatine returned.”

These days, however, quantum physics tends to come to the rescue. Because quantum physics offers the wonderful option to maintain that all previous events really happened, and yet your hero is still alive… in a parallel universe. Much is down to the remarkable cultural impact of the many-worlds interpretation of quantum physics, which has been steadily growing in fame (or notoriety) since Hugh Everett introduced it in 1957.

Is quantum physics unique in helping fiction authors make the impossible seem possible?

Not really! Before the “quantum” handwave, there was “nuclear”: think of Dr Manhattan from Watchmen, or Godzilla, as expressions of the utopian and dystopian expectations of that newly discovered branch of science. Before nuclear, there was electricity, with Frankenstein’s monster as perhaps its most important product. We can go all the way back to the invention of hydraulics in the ancient world, which led to an explosion of tales of liquid-operated automata – early forms of artificial intelligence – such as the bronze soldier Talos in ancient Greece. We have always used our latest discoveries to dream of a future in which our ancient tales of wonder could come true.

Is the many-worlds interpretation the most common theory used in science fiction inspired by quantum mechanics?

Many-worlds has become Marvel’s favourite trope. It allows them to expand on an increasingly entangled web of storylines that borrow from a range of remakes and reboots, as well as introducing gender and racial diversity into old stories. Marvel may have mainstreamed this interpretation, but the viewers of the average blockbuster may not realise exactly how niche it is, and how many alternatives there are. With many interpretations vying for acceptance, every once in a while a brave social scientist ventures to survey quantum physicists’ preferences. These studies tend to confirm the dominance of the Copenhagen interpretation, with its collapse of the wavefunction rather than the branching universes characteristic of the Everett interpretation. In a 2016 study, for instance, only 6% of quantum physicists claimed that Everett was their favourite interpretation. In 2018 I looked through a stack of popular quantum-physics books published between 1980 and 2017, and found that more than half of these books endorse the many-worlds interpretation. A non-physicist might be forgiven for thinking that quantum physicists are split between two equal-sized enemy camps of Copenhagenists and Everettians.

What makes the many-worlds interpretation so compelling?

Answering this brings us to a fundamental question that fiction has enjoyed exploring since humans first told each other stories: what if? “What if the Nazis won the Second World War?” is pretty much an entire genre by itself these days. Before that, there were alternate histories of the American Civil War and many other key historical events. This means that the many-worlds interpretation fits smoothly into an existing narrative genre. It suggests that these alternate histories may be real, that they are potentially accessible to us and simply happening in a different dimension. Even the specific idea of branching alternative universes existed in fiction before Hugh Everett applied it to quantum mechanics. One famous example is the 1941 short story The Garden of Forking Paths by the Argentinian writer Jorge Luis Borges, in which a writer tries to create a novel in which everything that could happen, happens. His story anticipated the many-worlds interpretation so closely that Bryce DeWitt used an extract from it as the epigraph to his 1973 edited collection The Many-Worlds Interpretation of Quantum Mechanics. But the most uncanny example is, perhaps, Andre Norton’s science-fiction novel The Crossroads of Time, from 1956 – published when Everett was writing his thesis. In her novel, a group of historians invents a “possibility worlds” theory of history. The protagonist, Blake Walker, discovers that this theory is true when he meets a group of men from a parallel universe who are on the hunt for a universe-travelling criminal. Travelling with them, Blake ends up in a world where Hitler won the Battle of Britain. Of course, in fiction, only worlds in which a significant change has taken place are of any real interest to the reader or viewer. (Blake also visits a world inhabited by metal dinosaurs.) The truly uncountable number of slightly different universes Everett’s theory implies are extremely difficult to get our heads around. Nonetheless, our storytelling mindsets have long primed us for a fascination with the many-worlds interpretation.

Have writers put other interpretations to good use?

For someone who really wants to put their physics degree to use in their spare time, I’d recommend the works of Greg Egan: although his novel Quarantine uses the controversial conscious collapse interpretation, he always ensures that the maths checks out. Egan’s attitude towards the scientific content of his novels is best summed up by a quote on his blog: “A few reviewers complained that they had trouble keeping straight [the science of his novel Incandescence]. This leaves me wondering if they’ve really never encountered a book that benefits from being read with a pad of paper and a pen beside it, or whether they’re just so hung up on the idea that only non-fiction should be accompanied by note-taking and diagram-scribbling that it never even occurred to them to do this.”

What other quantum concepts are widely used and abused?

We have Albert Einstein to thank for the extremely evocative description of quantum entanglement as “spooky action at a distance”. As with most scientific phenomena, a catchy nickname such as this one is extremely effective for getting a concept to stick in the popular imagination. While Einstein himself did not initially believe quantum entanglement could be a real phenomenon, as it would violate local causality, we now have both evidence and applications of entanglement in the real world, most notably in quantum cryptography. But in science fiction, the most common application of quantum entanglement is in faster-than-light communication. In her 1966 novel Rocannon’s World, Ursula K Le Guin describes a device called the “ansible”, which interstellar travellers use to instantaneously communicate with each other across vast distances. Her term was so influential that it now regularly appears in science fiction as a widely accepted name for a faster-than-light communications device, the same way we have adopted the word “robot” from the 1920 play R.U.R. by Karel Čapek.

Fiction may get the science wrong, but that is often because the story it tries to tell existed long before the science

How were cultural interpretations of entanglement influenced by the development of quantum theory?

It wasn’t until the 1970s that no-signalling theorems conclusively proved that entanglement correlations, while instantaneous, cannot be controlled or used to send messages. Explaining why is a lot more complex than communicating the notion that observing a particle here has an effect on a particle there. Once again, quantum physics seemingly provides just enough scientific justification to resolve an issue that has plagued science fiction ever since the speed of light was discovered: how can we travel through space, exploring galaxies, settling on distant planets, if we cannot communicate with each other? This same line of thought has sparked another entanglement-related invention in fiction: what if we can send not just messages but also people, or even entire spaceships, across vast distances faster than light using entanglement? Conveniently, quantum physicists had come up with another extremely evocative term that fit this idea perfectly: quantum teleportation. Real quantum teleportation only transfers information. But the idea of teleportation is so deeply embedded in our storytelling past that we can’t help extrapolating it. From stories of gods that could appear anywhere at will to tales of portals that lead to strange new worlds, we have always felt limited by the speeds of travel we have managed to achieve – and once again, the speed of light seems to be a hard limit that quantum teleportation might be able to get us around. In his 1999 novel Timeline, Michael Crichton sends a group of researchers back in time using quantum teleportation, and the videogame Half-Life 2 contains teleportation devices that similarly seem to work through quantum entanglement.
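
In textbook notation, the no-signalling result mentioned here (stated in standard form rather than in the interview's own words) boils down to the fact that local statistics depend only on the reduced state: whatever operation is performed on the distant half $A$ of an entangled pair $\rho_{AB}$, the state seen on the other side,

$$\rho_B = \mathrm{Tr}_A\,\rho_{AB},$$

is unchanged, so the correlations only become visible when the two sets of results are brought together by ordinary, slower-than-light communication.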

What quantum concepts have unexplored cultural potential?

Clearly, interpretations other than many worlds have a PR problem, so is anyone willing to write a chart topper based on the relational interpretation or QBism? More generally, I think that any question we do not yet have an answer to, or any theory that remains untestable, is a potential source for an excellent story. Richard Feynman famously said, “I think I can safely say that nobody understands quantum mechanics.” Ironically, it is precisely because of this that quantum physics has become such a widespread building block of science fiction: it is just hard enough to understand, just unresolved and unexplained enough to keep our hopes up that one day we might discover that interstellar communication or inter-universe travel might be possible. Few people would choose the realities of theorising over these ancient dreams. That said, the theorising may never have happened without the dreams. How many of your colleagues are intimately acquainted with the very science fiction they criticise for having unrealistic physics? We are creatures of habit and convenience held together by stories, physicists no less than everyone else. This is why we come up with catchy names for theories, and stories about dead-and-alive cats. Fiction may often get the science wrong, but that is often because the story it tries to tell existed long before the science.

Opinion | Encounters with artists
https://cerncourier.com/a/encounters-with-artists/ (26 March 2025)
Over the past 10 years, Mónica Bello facilitated hundreds of encounters between artists and scientists as curator of the Arts at CERN programme.

Why should scientists care about art?

Throughout my experiences in the laboratory, I have seen how art is an important part of a scientist’s life. By being connected with art, scientists recognise that their activities are very embedded in contemporary culture. Science is culture. Through art and dialogues with artists, people realise how important science is for society and for culture in general. Science is an important cultural pillar in our society, and these interactions bring scientists meaning.

Are science and art two separate cultures?

Today, if you ask anyone: “What is nature?” they describe everything in scientific terms. The way you describe things, the mysteries of your research: you are actually answering the questions that are present in everyone’s life. In this case, scientists have a sense of responsibility. I think art helps to open this dialogue from science into society.

Do scientists have a responsibility to communicate their research?

All of us have a social responsibility in everything we produce. Ideas don’t belong to anyone, so it’s a collective endeavour. I think that scientists don’t have the responsibility to communicate the research themselves, but that their research cannot be isolated from society. I think it’s a very joyful experience to see that someone cares about what you do.

Why should artists care about science?

If you go to any academic institution, there’s always a scientific component, very often also a technological one. A scientific aspect of your life is always present. This is happening because we’re all on the same course. It’s a consequence of this presence of science in our culture. Artists have an important role in our society, and they help to spark conversations that are important to everyone. Sometimes it might seem as though they are coming from a very individual lens, but in fact they have a very large reach and impact. Not immediately, not something that you can count with data, but there is definitely an impact. Artists open these channels for communicating and thinking about a particular aspect of science, which is difficult to see from a scientific perspective. Because in any discipline, it’s amazing to see your activity from the eyes of others.

Creativity and curiosity are the parameters and competencies that make up artists and scientists

A few years back we did a little survey, and most of the scientists thought that by spending time with artists, they took a step back to think about their research from a different lens, and this changed their perspective. They thought of this as a very positive experience. So I think art is not only about communicating to the public, but about exploring the personal synergies of art and science. This is why artists are so important.

Do experimental and theoretical physicists have different attitudes towards art?

Typically, we think that theorists are much more open to artists, but I don’t agree. In my experiences at CERN, I found many engineers and experimental physicists being highly theoretical. Both value artistic perspectives and their ability to consider questions and scientific ideas in an unconventional way. Experimental physicists would emphasise engagement with instruments and data, while theoretical physicists would focus on conceptual abstraction.

By being with artists, many experimentalists feel that they have the opportunity to talk about things beyond their research. For example, we often talk about the “frontiers of knowledge”. When asked about this, experimentalists or theoretical physicists might tell us about something other than particle physics – like neuroscience, or the brain and consciousness. A scientist is a scientist. They are very curious about everything.

Do these interactions help to blur the distinction between art and science?

Well, here I’m a bit radical because I know that creativity is something we define. Creativity and curiosity are the parameters and competencies that make up artists and scientists. But to become a scientist or an artist you need years of training – it’s not that you can become one just because you are a curious and creative person.

Chroma VII work of art

Not many people can chat about particle physics, but scientists very often chat with artists. I saw artists speaking for hours with scientists about the Higgs field. When you see two people speaking about the same thing, but with different registers, knowledge and background, it’s a precious moment.

When facilitating these discussions between physicists and artists, we don’t speak only about physics, but about everything that worries them. Through that, grows a sort of intimacy that often becomes something else: a friendship. This is the point at which a scientist stops being an information point for an artist and becomes someone who deals with big questions alongside an artist – who is also a very knowledgeable and curious person. This is a process rich in contrast, and you get many interesting surprises out of these interactions.

But even in this moment, they are still artists and scientists. They don’t become this blurred figure that can do anything.

Can scientific discovery exist without art?

That’s a very tricky question. I think that art is a component of science, therefore science cannot exist without art – without the qualities that the artist and scientist have in common. To advance science, you have to create a question that needs to be answered experimentally.

Did discoveries in quantum mechanics affect the arts?

Everything is subjected to quantum mechanics. Maybe what it changed was an attitude towards uncertainty: what we see and what we think is there. There was an increased sense of doubt and general uncertainty in the arts.

Do art and science evolve together or separately?

I think there have been moments of convergence – you can clearly see it in any of the avant garde. The same applies to literature; for example, modernist writers showed a keen interest in science. Poets such as T S Eliot approached poetry with a clear resonance of the first scientific revolutions of the century. There are references to the contributions of Faraday, Maxwell and Planck. You can tell these artists and poets were informed and eager to follow what science was revealing about the world.

You can also note the influence of science in music, as physicists get a better understanding of the physical aspects of sound and matter. Physics became less about viewing the world through a lens, and instead focused on the invisible: the vibrations of matter, electricity, the innermost components of materials. At the end of the 19th century and the beginning of the 20th, these examples crop up constantly. It’s not just representing the world as you see it through a particular lens, but being involved in the phenomena of the world and these uncensored realities.

From the 1950s to the 1970s you can see these connections in every single moment. Science is very present in the work of artists, but my feeling is that we don’t have enough literature about it. We really need to conduct more research on this connection between humanities and science.

What are your favourite examples of art influencing science?

Feynman diagrams are one example. Feynman was amazing – a prodigy. Many people before him tried to represent things that escaped our intuition visually and failed. We also have the Pauli Archives here at CERN. Pauli was not the most popular father of quantum mechanics, but he was determined to not only understand mathematical equations but to visualise them, and share them with his friends and colleagues. This sort of endeavour goes beyond just writing – it is about the possibility of creating a tangible experience. I think scientists do that all the time by building machines, and then by trying to understand these machines statistically. I see that in the laboratory constantly, and it’s very revealing because usually people might think of these statistics as something no one cares about – that the visuals are clumsy and nerdy. But they’re not.

Even Leonardo da Vinci was known as a scientist and an artist, but his anatomical sketches were not discovered until hundreds of years after his other works. Newton was also paranoid about expressing his true scientific theories because of the social standards and politics of the time. His views were unorthodox, and he did not want to ruin his prestigious reputation.

Today’s culture also influences how we interpret history. We often think of Aristotle as a philosopher, yet he is also recognised for contributions to natural history. The same with Democritus, whose ideas laid foundations for scientific thought.

So I think that opening laboratories to artists is very revealing about the influence of today’s culture on science.

When did natural philosophy branch out into art and science?

I believe it was during the development of the scientific method: observation, analysis and the evolution of objectivity. The departure point was definitely when we developed a need to be objective. It took centuries to get where we are now, but I think there is a clear division: a line with philosophy, natural philosophy and natural history on one side, and modern science on the other. Today, I think art and science have different purposes. They convene at different moments, but there is always this detour. Some artists are very scientific minded, and some others are more abstract, but they are both bound to speculate massively.

It’s really good news for everyone that labs want to include non-scientists

For example, at our Arts at CERN programme we have had artists who were interested in niche scientific aspects. Erich Berger, an artist from Finland, was interested in designing a detector, and scientists whom he met kept telling him that he would need to calibrate the detector. The scientist and the artist here had different goals. For the scientist, the most important thing is that the detector has precision in the greatest complexity. And for the artist, it’s not. It’s about the process of creation, not the analysis.

Do you think that science is purely an objective medium while art is a subjective one?

No. It’s difficult to define subjectivity and objectivity. But art can be very objective. Artists create artefacts to convey their intended message. It’s not that these creations are standing alone without purpose. No, we are beyond that. Now art seeks meaning that is, in this context, grounded in scientific and technological expertise.

How do you see the future of art and science evolving?

There are financial threats to both disciplines. We are still in this moment where things look a bit bleak. But I think our programme is pioneering, because many scientific labs are developing their own arts programmes inspired by the example of Arts at CERN. This is really great, because unless you are in a laboratory, you don’t see what doing science is really about. We usually read science in the newspapers or listen to it on a podcast – everything is very much oriented to the communication of science, but making science is something very specific. It’s really good news for everyone that laboratories want to include non-scientists. Arts at CERN works mostly with visual artists, but you could imagine filmmakers, philosophers, those from the humanities, poets or almost anyone at all, depending on the model that one wants to create in the lab.

Feature | Edoardo Amaldi and the birth of Big Science
https://cerncourier.com/a/edoardo-amaldi-and-the-birth-of-big-science/ (24 March 2025)
In an interview drawing on memories from childhood and throughout his own distinguished career at CERN, Ugo Amaldi offers deeply personal insights into his father Edoardo’s foundational contributions to international cooperation in science.

Ugo Amaldi beside a portrait of his father Edoardo

Should we start with your father’s involvement in the founding of CERN?

I began hearing my father talk about a new European laboratory while I was still in high school in Rome. Our lunch table was always alive with discussions about science, physics and the vision of this new laboratory. Later, I learned that between 1948 and 1949, my father was deeply engaged in these conversations with two of his friends: Gilberto Bernardini, a well-known cosmic-ray expert, and Bruno Ferretti, a professor of theoretical physics at Rome University. I was 15 years old and those table discussions remain vivid in my memory.

So, the idea of a European laboratory was already being discussed before the 1950 UNESCO meeting?

Yes, indeed. Several eminent European physicists, including my father, Pierre Auger, Lew Kowarski and Francis Perrin, recognised that Europe could only be competitive in nuclear physics through collaborative efforts. All the actors wanted to create a research centre that would stop the post-war exodus of physics talent to North America and help rebuild European science. I now know that my father’s involvement began in 1946 when he travelled to Cambridge, Massachusetts, for a conference. There, he met Nobel Prize winner John Cockcroft, and their conversations planted in his mind the first seeds for a European laboratory.

Parallel to scientific discussions, there was an important political initiative led by Swiss philosopher and writer Denis de Rougemont. After spending the war years at Princeton University, he returned to Europe with a vision of fostering unity and peace. He established the Institute of European Culture in Lausanne, Switzerland, where politicians from France, Britain and Germany would meet. In December 1949, during the European Cultural Conference in Lausanne, French Nobel Prize winner Louis de Broglie sent a letter advocating for a European laboratory where scientists from across the continent could work together peacefully.

The Amaldi family in 1948

My father strongly believed in the importance of accelerators to advance the new field that, at the time, was at the crossroads between nuclear physics and cosmic-ray physics. Before the war, in 1936, he had travelled to Berkeley to learn about cyclotrons from Ernest Lawrence. He even attempted to build a cyclotron in Italy in 1942, profiting from the World’s Fair that was to have been held in Rome. Moreover, he was deeply affected by the exodus of talented Italian physicists after the war, including Bruno Rossi, Gian Carlo Wick and Giuseppe Cocconi. He saw CERN as a way to bring these scientists back and rebuild European physics.

How did Isidor Rabi’s involvement come into play?

In 1950 my father was corresponding with Gilberto Bernardini, who was spending a year at Columbia University. There Bernardini mentioned the idea of a European laboratory to Isidor Rabi, who, at the same time, was in contact with other prominent figures in this decentralised and multi-centred initiative. Together with Norman Ramsey, Rabi had previously succeeded, in 1947, in persuading nine northeastern US universities to collaborate under the banner of Associated Universities, Inc, which led to the establishment of Brookhaven National Laboratory.

What is not generally known is that before Rabi gave his famous speech at the fifth assembly of UNESCO in Florence in June 1950, he came to Rome and met with my father. They discussed how to bring this idea to fruition. A few days later, Rabi’s resolution at the UNESCO meeting calling for regional research facilities was a crucial step in launching the project. Rabi considered CERN a peaceful compensation for the fact that physicists had built the nuclear bomb.

How did your father and his colleagues proceed after the UNESCO resolution?

Following the UNESCO meeting, Pierre Auger, at that time director of exact and natural sciences at UNESCO, and my father took on the task of advancing the project. In September 1950 Auger spoke of it at a nuclear physics conference in Oxford, and at a meeting of the International Union of Pure and Applied Physics (IUPAP), my father – one of the vice presidents – urged the executive committee to consider how best to implement the Florence resolution. In May 1951, Auger and my father organised a meeting of experts at UNESCO headquarters in Paris, where a compelling justification for the European project was drafted.

The cost of such an endeavour was beyond the means of any single nation. This led to an intergovernmental conference under the auspices of UNESCO in December 1951, where the foundations for CERN were laid. Funding, totalling $10,000 for the initial meetings of the board of experts, came from Italy, France and Belgium. This was thanks to the financial support of men like Gustavo Colonnetti, president of the Italian Research Council, who had already – a year before – donated the first funds to UNESCO.

Were there any significant challenges during this period?

Not everyone readily accepted the idea of a European laboratory. Eminent physicists like Niels Bohr, James Chadwick and Hendrik Kramers questioned the practicality of starting a new laboratory from scratch. They were concerned about the feasibility and allocation of resources, and preferred the coordination of many national laboratories and institutions. Through skilful negotiation and compromise, Auger and my father incorporated some of the concerns raised by the sceptics into a modified version of the project, ensuring broader support. In February 1952 the first agreement setting up a provisional council for CERN was written and signed, and my father was nominated secretary general of the provisional CERN.

Enrico and Giulio Fermi, Ginestra Amaldi, Laura Fermi, Edoardo and Ugo Amaldi

He worked tirelessly, travelling through Europe to unite the member states and start the laboratory’s construction. In particular, the UK was reluctant to participate fully. They had their own advanced facilities, like the 40 MeV cyclotron at the University of Liverpool. In December 1952 my father visited John Cockcroft, at the time director of the Harwell Atomic Energy Research Establishment, to discuss this. There’s an interesting episode where my father, with Cockcroft, met Frederick Lindemann, Lord Cherwell, a long-time scientific advisor to Winston Churchill. Cherwell dismissed CERN as another “European paper mill.” My father, usually composed, lost his temper and passionately defended the project. During the following visit to Harwell, Cockcroft reassured him that his reaction was appropriate. From that point on, the UK contributed to CERN, albeit initially as a series of donations rather than as the result of a formal commitment. It may be interesting to add that, during the same visit to London and Harwell, my father met the young John Adams and was so impressed that he immediately offered him a position at CERN.

What were the steps following the ratification of CERN’s convention?

Robert Valeur, chairman of the council during the interim period, and Ben Lockspeiser, chairman of the interim finance committee, used their authority to stir up early initiatives and create an atmosphere of confidence that attracted scientists from all over Europe. As Lew Kowarski noted, there was a sense of “moral commitment” to leave secure positions at home and embark on this new scientific endeavour.

During the interim period from May 1952 to September 1954, the council convened three sessions in Geneva whose primary focus was financial management. The organisation began with an initial endowment of approximately 1 million Swiss Francs, which – as I said – included a contribution from the UK known as the “observer’s gift”. At each subsequent session, the council increased its funding, reaching around 3.7 million Swiss Francs by the end of this period. When the permanent organisation was established, an initial sum of 4.1 million Swiss Francs was made available.

Giuseppe Fidecaro, Edoardo Amaldi and Werner Heisenberg at CERN in 1960

In 1954, my father was worried that if the parliaments didn’t approve the convention before winter, then construction would be delayed because of the wintertime. So he took a bold step and, with the approval of the council president, authorised the start of construction on the main site before the convention was fully ratified.

This led to Lockspeiser jokingly remarking later that the council “has now to keep Amaldi out of jail”. The provisional council, set up in 1952, was dissolved when the European Organization for Nuclear Research officially came into being in 1954, though the acronym CERN (Conseil Européen pour la Recherche Nucléaire) was retained. By the conclusion of the interim period, CERN had grown significantly. A critical moment occurred on 29 September 1954, when a specific point in the ratification procedure was reached, rendering all assets temporarily ownerless. During this eight-day period, my father, serving as secretary general, was the sole owner on behalf of the newly forming permanent organisation. The interim phase concluded with the first meeting of the permanent council, marking the end of CERN’s formative years.

Did your father ever consider becoming CERN’s Director-General?

People asked him to be Director-General, but he declined for two reasons. First, he wanted to return to his students and his cosmic-ray research in Rome. Second, he didn’t want people to think he had done all this to secure a prominent position. He believed in the project for its own sake.

When the convention was finally ratified in 1954, the council offered the position of Director-General to Felix Bloch, a Swiss–American physicist and Nobel Prize winner for his work on nuclear magnetic resonance. Bloch accepted but insisted that my father serve as his deputy. My father, dedicated to CERN’s success, agreed to this despite his desire to return to Rome full time.

How did that arrangement work out?

My father agreed but Bloch wasn’t at that time rooted in Europe. He insisted on bringing all his instruments from Stanford so he could continue his research on nuclear magnetic resonance at CERN. He found it difficult to adapt to the demands of leading CERN and soon resigned. The council then elected Cornelis Jan Bakker, a Dutch physicist who had led the synchrocyclotron group, as the new Director-General. From the beginning, he was the person my father thought would have been the ideal director for the initial phase of CERN. Tragically though, Bakker died in a plane crash a year and a half later. I well remember how hard my father was hit by this loss.

How did the development of accelerators at CERN progress?

The decision to adopt the strong focusing principle for the Proton Synchrotron (PS) was a pivotal moment. In August 1952 Otto Dahl, leader of the Proton Synchrotron study group, Frank Goward and Rolf Widerøe visited Brookhaven just as Ernest Courant, Stanley Livingston and Hartland Snyder were developing this new principle. They were so excited by this development that they returned to CERN determined to incorporate it into the PS design. In 1953 Mervyn Hine, a long-time friend of John Adams who had moved with him to CERN, studied potential issues with misalignment in strong-focusing magnets, which led to further refinements in the design. Ultimately, the PS became operational before the comparable accelerator at Brookhaven, marking a significant achievement for European science.

Edoardo Amaldi and Victor Weisskopf in 1974

It’s important here to recognise the crucial contributions of the engineers, who often don’t receive the same level of recognition as physicists. They are the ones who make the work of experimental physicists and theorists possible. “Viki” Weisskopf, Director-General of CERN from 1961 to 1965, compared the situation to the discovery of America. The machine builders are the captains and shipbuilders. The experimentalists are those fellows on the ships who sailed to the other side of the world and wrote down what they saw. The theoretical physicists are those who stayed behind in Madrid and told Columbus that he was going to land in India.

Your father also had a profound impact on the development of other Big Science organisations in Europe

Yes, in 1958 my father, together with Pierre Auger, was instrumental in launching the initiative that led to the founding of the European Space Agency. In a letter written in 1958 to his friend Luigi Crocco, who was professor of jet propulsion at Princeton, he wrote that “it is now very much evident that this problem is not at the level of the single states like Italy, but mainly at the continental level. Therefore, if such an endeavour is to be pursued, it must be done on a European scale, as already done for the building of the large accelerators for which CERN was created… I think it is absolutely imperative for the future organisation to be neither military nor linked to any military organisation. It must be a purely scientific organisation, open – like CERN – to all forms of cooperation and outside the participating countries.” This document reflects my father’s vision of peaceful and non-military European science.

How is it possible for one person to contribute so profoundly to science and global collaboration?

My father’s ability to accept defeats and keep pushing forward was key to his success. He was an exceptional person with a clear vision and unwavering dedication. I hope that by sharing these stories, others might be inspired to pursue their goals with the same persistence and passion.

Could we argue that he was not only a visionary but also a relentless advocate?

He travelled extensively, talked to countless people, and was always cheerful and energetic. He accepted setbacks but kept moving forwards. In this connection, I want to mention Eliane Bertrand, later de Modzelewska, his secretary in Rome who later became secretary of the CERN Council for about 20 years, serving under several Director-Generals. She left a memoir about those early days, highlighting how my father was always travelling, talking and never stopping. It’s a valuable piece of history that, I think, should be published.

Eliane de Modzelewska

International collaboration has been a recurring theme in your own career. How do you view its importance today?

International collaboration is more critical than ever in today’s world. Science has always been a bridge between cultures and nations, and CERN’s history is a testimony of what this brings to humanity. It transcends political differences and fosters mutual understanding. I hope CERN and the broader scientific community will find ways to maintain these vital connections with all countries. I’ve always believed that fostering a collaborative and inclusive environment is one of the main goals of us scientists. It’s not just about achieving results but also about how we work together and support each other along the way.

Looking ahead, what are your thoughts on the future of CERN and particle physics?

I firmly believe that pursuing higher collision energies is essential. While the Large Hadron Collider has achieved remarkable successes, there’s still much we haven’t uncovered – especially regarding supersymmetry. Even though minimal supersymmetry does not apply, I remain convinced that supersymmetry might manifest in ways we haven’t yet understood. Exploring higher energies could reveal supersymmetric particles or other new phenomena.

Like most European physicists, I support the initiative of the Future Circular Collider and starting with an electron–positron collider phase so to explore new frontiers at two very different energy levels. However, if geopolitical shifts delay or complicate these plans, we should consider pushing hard on alternative strategies like developing the technologies for muon colliders.

Ugo Amaldi first arrived at CERN as a fellow in September 1961. Then, for 10 years at the ISS in Rome, he opened two new lines of research: quasi-free electron scattering on nuclei and atoms. Back at CERN, he developed the Roman pots experimental technique, was a co-discoverer of the rise of the proton–proton cross-section with energy, measured the polarisation of muons produced by neutrinos, proposed the concept of a superconducting electron–positron linear collider, and led LEP’s DELPHI Collaboration. Today, he advances the use of accelerators in cancer treatment as the founder of the TERA Foundation for hadron therapy and as president emeritus of the National Centre for Oncological Hadrontherapy (CNAO) in Pavia. He continues his mother and father’s legacy of authoring high-school physics textbooks used by millions of Italian pupils. His motto is: “Physics is beautiful and useful.”

This interview first appeared in the newsletter of CERN’s experimental physics department. It has been edited for concision.

The post Edoardo Amaldi and the birth of Big Science appeared first on CERN Courier.

Charm and synthesis https://cerncourier.com/a/charm-and-synthesis/ Mon, 27 Jan 2025 07:43:29 +0000 https://cerncourier.com/?p=112128 Sheldon Glashow recalls the events surrounding a remarkable decade of model building and discovery between 1964 and 1974.

In 1955, after a year of graduate study at Harvard, I joined a group of a dozen or so students committed to studying elementary particle theory. We approached Julian Schwinger, one of the founders of quantum electrodynamics, hoping to become his thesis students – and we all did.

Schwinger lined us up in his office, and spent several hours assigning thesis subjects. It was a remarkable performance. I was the last in line. Having run out of well-defined thesis problems, he explained to me that weak and electromagnetic interactions share two remarkable features: both are vectorial and both display aspects of universality. Schwinger suggested that I create a unified theory of the two interactions – an electroweak synthesis. How I was to do this he did not say, aside from slyly hinting at the Yang–Mills gauge theory.

By the summer of 1958, I had convinced myself that weak and electromagnetic interactions might be described by a badly broken gauge theory, and Schwinger that I deserved a PhD. I had hoped to spend part of a postdoctoral fellowship in Moscow at the invitation of the recent Russian Nobel laureate Igor Tamm, and sought to visit Niels Bohr’s institute in Copenhagen while awaiting my Soviet visa. With Bohr’s enthusiastic consent, I boarded the SS Île de France with my friend Jack Schnepps. Following a memorable and luxurious crossing – one of the great ship’s last – Jack drove south to work with Milla Baldo-Ceolin’s emulsion group in Padova, and I took the slow train north to Copenhagen. Thankfully, my Soviet visa never arrived. I found the SU(2) × U(1) structure of the electroweak model in the spring of 1960 at Bohr’s famous institute at Blegdamsvej 19, and wrote the paper that would earn my share of the 1979 Nobel Prize.

We called the new quark flavour charm, completing two weak doublets of quarks to match two weak doublets of leptons, and establishing lepton–quark symmetry, which holds to this day

A year earlier, in 1959, Augusto Gamba, Bob Marshak and Susumu Okubo had proposed lepton–hadron symmetry, which regarded protons, neutrons and lambda hyperons as the building blocks of all hadrons, to match the three known leptons at the time: neutrinos, electrons and muons. The idea was falsified by the discovery of a second neutrino in 1962, and superseded in 1964 by the invention of fractionally charged hadron constituents, first by George Zweig and André Petermann, and then decisively by Murray Gell-Mann with his three flavours of quarks. Later in 1964, while on sabbatical in Copenhagen, James Bjorken and I realised that lepton–hadron symmetry could be revived simply by adding a fourth quark flavour to Gell-Mann’s three. We called the new quark flavour “charm”, completing two weak doublets of quarks to match two weak doublets of leptons, and establishing lepton–quark symmetry, which holds to this day.

Annus mirabilis

1964 was a remarkable year. In addition to the invention of quarks, Nick Samios spotted the triply strange Ω baryon, and Oscar Greenberg devised what became the critical notion of colour. Arno Penzias and Robert Wilson stumbled on the cosmic microwave background radiation. James Cronin, Val Fitch and others discovered CP violation. Robert Brout, François Englert, Peter Higgs and others invented spontaneously broken non-Abelian gauge theories. And to top off the year, Abdus Salam rediscovered and published my SU(2) × U(1) model, after I had more-or-less abandoned electroweak thoughts due to four seemingly intractable problems.

Four intractable problems of early 1964

How could the W and Z bosons acquire masses while leaving the photon massless?

Steven Weinberg, my friend from both high-school and college, brilliantly solved this problem in 1967 by subjecting the electroweak gauge group to spontaneous symmetry breaking, initiating the half-century-long search for the Higgs boson. Salam published the same solution in 1968.

How could an electroweak model of leptons be extended to describe the weak interactions of hadrons?

John Iliopoulos, Luciano Maiani and I solved this problem in 1970 by introducing charm and quark-lepton symmetry to avoid unobserved strangeness-changing neutral currents.

Was the spontaneously broken electroweak gauge model mathematically consistent?

Gerard ’t Hooft announced in 1971 that he had proven Steven Weinberg’s electroweak model to be renormalisable. In 1972, Claude Bouchiat, John Iliopoulos and Philippe Meyer demonstrated the electroweak model to be free of Adler anomalies provided that lepton–quark symmetry is maintained.

Could the electroweak model describe CP violation without invoking additional spinless fields?

In 1973, Makoto Kobayashi and Toshihide Maskawa showed that the electroweak model could easily and naturally violate CP if there are more than four quark flavours.

Much to my surprise and delight, all of them would be solved within just a few years, with the last theoretical obstacle removed by Makoto Kobayashi and Toshihide Maskawa in 1973 (see “Four intractable problems” panel). A few months later, Paul Musset announced that CERN’s Gargamelle detector had won the race to detect weak neutral-current interactions, giving the electroweak model the status of a predictive theory. Remarkably, the year had begun with Gell-Mann, Harald Fritzsch and Heinrich Leutwyler proposing QCD, and David Gross, Frank Wilczek and David Politzer showing it to be asymptotically free. The Standard Model of particle physics was born.

Charmed findings

But where were the charmed quarks? Early on Monday morning on 11 November 1974, I was awakened by a phone call from Sam Ting, who asked me to come to his MIT office as soon as possible. He and Ulrich Becker were waiting for me impatiently. They showed me an amazingly sharp resonance. Could it be a vector meson like the ρ or ω and be so narrow, or was it something quite different? I hopped in my car and drove to Harvard, where my colleagues Alvaro de Rújula and Howard Georgi excitedly regaled me with the Californian side of the story. A few days later, experimenters in Frascati confirmed the BNL–SLAC discovery, and de Rújula and I submitted our paper “Is Bound Charm Found?” – one of two papers on the J/ψ discovery printed in Physical Review Letters in January 1975 that would prove to be correct. Among five false papers was one written by my beloved mentor, Julian Schwinger.

Sam Ting at CERN in 1976

The second correct paper was by Tom Appelquist and David Politzer. Well before that November, they had realised (without publishing) that bound states of a charmed quark and its antiquark lying below the charm threshold would be exceptionally narrow due to the asymptotic freedom of QCD. De Rújula suggested to them that such a system be called charmonium, by analogy with positronium. His term made it into the dictionary. Shortly afterward, the 1976 Nobel Prize in Physics was jointly awarded to Burton Richter and Sam Ting for “their pioneering work in the discovery of a heavy elementary particle of a new kind” – evidence that charm was not yet a universally accepted explanation. Over the next few years, experimenters worked hard to confirm the predictions of theorists at Harvard and Cornell by detecting and measuring the masses, spins and transitions among the eight sub-threshold charmonium states. Later on, they would do the same for 14 relatively narrow states of bottomonium.

Abdus Salam, Tom Ball and Paul Musset

Other experimenters were searching for particles containing just one charmed quark or antiquark. In our 1975 paper “Hadron Masses in a Gauge Theory”, de Rújula, Georgi and I included predictions of the masses of several not-yet-discovered charmed mesons and baryons. The first claim to have detected charmed particles was made in 1975 by Robert Palmer and Nick Samios at Brookhaven, again with a bubble-chamber event. It seemed to show a cascade decay process in which one charmed baryon decays into another charmed baryon, which itself decays. The measured masses of both of the charmed baryons were in excellent agreement with our predictions. Though the claim was not widely accepted, I believe to this day that Samios and Palmer were the first to detect charmed particles.

Sheldon Glashow and Steven Weinberg

The SLAC electron–positron collider, operating well above charm threshold, was certainly producing charmed particles copiously. Why were they not being detected? I recall attending a conference in Wisconsin that was largely dedicated to this question. On the flight home, I met my old friend Gerson Goldhaber, who had been struggling unsuccessfully to find them. I think I convinced him to try a bit harder. A couple of weeks later in 1976, Goldhaber and François Pierre succeeded. My role in charm physics had come to a happy ending. 

  • This article is adapted from a presentation given at the Institute of High-Energy Physics in Beijing on 20 October 2024 to celebrate the 50th anniversary of the discovery of the J/ψ.

The post Charm and synthesis appeared first on CERN Courier.

Inside pyramids, underneath glaciers https://cerncourier.com/a/inside-pyramids-underneath-glaciers/ Wed, 20 Nov 2024 13:48:19 +0000 https://cern-courier.web.cern.ch/?p=111476 Coordinated by editors Paola Scampoli and Akitaka Ariga, Cosmic Ray Muography provides an invaluable snapshot of a booming research area.

Muon radiography – muography for short – uses cosmic-ray muons to probe and image large, dense objects. Coordinated by editors Paola Scampoli and Akitaka Ariga of the University of Bern, the authors of this book provide an invaluable snapshot of this booming research area. From muon detectors, which differ significantly from those used in fundamental physics research, to applications of muography in scientific, cultural, industrial and societal scenarios, a broad cross section of experts describe the physical principles that underpin modern muography.

Hiroyuki Tanaka of the University of Tokyo begins the book with historical developments and perspectives. He guides readers from the first documented use of cosmic-ray muons in 1955 for rock overburden estimation, to current studies of the sea-level dynamics in Tokyo Bay using muon detectors laid on the seafloor and visionary ideas to bring muography to other planets using teleguided rovers.

Scattering methods

Tanaka limits his discussion to the muon-absorption approach to muography, which images an object by comparing the muon flux before and after – or with and without – an object. The muon-scattering approach, which was invented two decades ago, instead exploits the deflection of muons as they pass through matter, caused by electromagnetic interactions with nuclei. The interested reader will find several examples of the application of muon scattering in other chapters, particularly that on civil and industrial applications by Davide Pagano (Pavia) and Altea Lorenzon (Padova). Scattering methods have an edge in these fields thanks to their sensitivity to the atomic number of the materials under investigation.

Cosmic Ray Muography

Peter Grieder (Bern), who sadly passed away shortly before the publication of the book, gives an excellent and concise introduction to the physics of cosmic rays, which Paolo Checchia (Padova) expands on, delving into the physics of interactions between muons and matter. Akira Nishio (Nagoya University) describes the history and physical principles of nuclear emulsions. These detectors played an important role in the history of particle physics, but are not very popular now as they cannot provide real-time information. Though modern detectors are a more common choice today, nuclear emulsions still find a niche in muography thanks to their portability. The large accumulation of data from muography experiments requires automatic analysis, for which dedicated scanning systems have been developed. Nishio includes a long and insightful discussion on how the nuclear-emulsions community reacted to supply-chain evolution. The transition from analogue to digital cameras meant that most film-producing firms changed their core business or simply disappeared, and researchers had to take a large part of the production process into their own hands.

Fabio Ambrosino and Giulio Saracino of INFN Napoli next take on the task of providing an overview of the much broader and more popular category of real-time detectors, such as those commonly used in experiments at particle colliders. Elaborating on the requirements set by the cosmic rate and environmental factors, their chapter explains why scintillator and gas-based tracking devices are the most popular options in muography. They also touch on more exotic detector options, including Cherenkov telescopes and cylindrical tracking detectors that fit in boreholes.

In spite of their superficial similarity, methods that are common in X-ray imaging need quite a lot of ingenuity to be adapted to the context of muography. For example, the source cannot be controlled in muography, and is not monochromatic. Both energy and direction are random and have a very broad distribution, and one cannot afford to take data from more than a few viewpoints. Shogo Nagahara and Seigo Miyamoto of the University of Tokyo provide a specialised but intriguing insight into 3D image reconstruction using filtered back-projection.

A broad cross section of experts describe the physical principles that underpin modern muography

Geoscience is among the most mature applications of muography. While Jacques Marteau (Claude Bernard University Lyon 1) provides a broad overview of decades of activities spanning from volcano studies to the exploration of natural caves, Ryuichi Nishiyama (Tokyo) explores recent studies where muography provided unique data on the shape of the bedrock underneath two major glaciers in the Swiss Alps.

One of the greatest successes of muography is the study of pyramids, which is given ample space in the chapter on archaeology by Kunihiro Morishima (Nagoya). In 1971, Nobel-laureate Luis Alvarez’s team pioneered the use of muography in archaeology during an investigation at the pyramid of Khafre in Giza, Egypt, motivated by his hunch that an unknown large chamber could be hiding in the pyramid. Their data convincingly excluded that possibility, but the attempt can be regarded as launching modern muography (CERN Courier May/June 2023 p32). Half a century later, muography was reintroduced to the exploration of Egyptian pyramids thanks to ScanPyramids – an international project led by particle-physics teams in France and Japan under the supervision of the Heritage Innovation and Preservation Institute. ScanPyramids aims at systematically surveying all of the main pyramids in the Giza complex, and recently made headlines by finding a previously unknown corridor-shaped cavity in Khufu’s Great Pyramid, which is the second largest pyramid in the world. To support the claim, which was initially based on muography alone, the finding was cross-checked with the more traditional surveying method based on ground penetrating radar, and finally confirmed via visual inspection through an endoscope.

Pedagogical focus

This book is a precious resource for anyone approaching muography, from students to senior scientists, and potential practitioners from both academic and industrial communities. There are some other excellent books that have already been published on the same topic, and that have showcased original research, but Cosmic Ray Muography’s pedagogical focus, which prioritises the explanation of timeless first principles, will not become outdated any time soon. Given that each chapter was written independently, there is a certain degree of overlap and some inconsistency in terminology, but this gives the reader valuable exposure to different perspectives about what matters most in this type of research.

The post Inside pyramids, underneath glaciers appeared first on CERN Courier.

Dignitaries mark CERN’s 70th anniversary https://cerncourier.com/a/dignitaries-mark-cerns-70th-anniversary/ Wed, 20 Nov 2024 13:22:30 +0000 https://cern-courier.web.cern.ch/?p=111412 On 1 October a high-level ceremony at CERN marked 70 years of science, innovation and collaboration.

On 1 October a high-level ceremony at CERN marked 70 years of science, innovation and collaboration. In attendance were 38 national delegations, including eight heads of state or government and 13 ministers, along with many scientific, political and economic leaders who demonstrated strong support for CERN’s mission and future ambition. “CERN has become a global hub because it rallied Europe, and this is even more crucial today,” said president of the European Commission Ursula von der Leyen. “China is planning a 100 km collider to challenge CERN’s global leadership. Therefore, I am proud that we have financed the feasibility study for CERN’s Future Circular Collider. As the global science race is on, I want Europe to switch gear.” CERN’s year-long 70th anniversary programme has seen more than 100 events organised in 63 cities in 28 countries, bringing together thousands of people to discuss the wonders and applications of particle physics. “I am very honoured to welcome representatives from our Member and Associate Member States, our Observers and our partners from all over the world on this very special day,” said CERN Director-General Fabiola Gianotti. “CERN is a great success for Europe and its global partners, and our founders would be very proud to see what CERN has accomplished over the seven decades of its life.”

The post Dignitaries mark CERN’s 70th anniversary appeared first on CERN Courier.

The new particles https://cerncourier.com/a/the-new-particles/ Fri, 15 Nov 2024 13:43:09 +0000 https://cern-courier.web.cern.ch/?p=111393 Fifty years ago, the discovery of the J/ψ and its excitations sparked the November Revolution in particle physics, giving fresh experimental impetus to the theoretical ideas that would become the Standard Model.

Sam Ting in November 1974

Anyone in touch with the world of high-energy physics will be well aware of the ferment created by the news from Brookhaven and Stanford, followed by Frascati and DESY, of the existence of new particles. But new particles have been unearthed in profusion by high-energy accelerators during the past 20 years. Why the excitement over the new discoveries?

A brief answer is that the particles have been found in a mass region where they were completely unexpected with stability properties which, at this stage of the game, are completely inexplicable. In this article we will first describe the discoveries and then discuss some of the speculations as to what the discoveries might mean.

We begin at the Brookhaven National Laboratory where, since the Spring of this year, an MIT/Brookhaven team have been looking at collisions between two protons which yielded (amongst other things) an electron and a positron. A series of experiments on the production of electron–positron pairs in particle collisions has been going on for about eight years in groups led by Sam Ting, mainly at the DESY synchrotron in Hamburg. The aim is to study some of the electromagnetic features of particles where energy is manifest in the form of a photon which materialises in an electron–positron pair. The experiments are not easy to do because the probability that the collisions will yield such a pair is very low. The detection system has to be capable of picking out an event from a million or more other types of event.

Beryllium bombardment

It was with long experience of such problems behind them that the MIT/Brookhaven team led by Ting, J J Aubert, U J Becker and P J Biggs brought into action a detection system with a double arm spectrometer in a slow ejected proton beam at the Brookhaven 33 GeV synchrotron. They used beams of 28.5 GeV bombarding a beryllium target. The two spectrometer arms fan out at 15° on either side of the incident beam direction and have magnets, Cherenkov counters, multiwire proportional chambers, scintillation counters and lead glass counters. With this array, it is possible to identify electrons and positrons coming from the same source and to measure their energy.

From about August, the realisation that they were on to something important began slowly to grow. The spectrometer was totting up an unusually large number of events where the combined energies of the electron and positron were equal to 3.1 GeV.

The detection system of the experiment at Brookhaven that spotted the new particle

This is the classic way of spotting a resonance. An unstable particle, which breaks up too quickly to be seen itself, is identified by adding up the energies of more stable particles which emerge from its decay. Looking at many interactions, if energies repeatedly add up to the same figure (as opposed to the other possible figures all around it), they indicate that the measured particles are coming from the break up of an unseen particle whose mass is equal to the measured sum.
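In modern notation – a standard kinematic relation added here for clarity, not part of the original 1974 text – the “measured sum” is the invariant mass of the pair. For a decay X → e⁺e⁻ it reads

$$M_X^2 c^4 = (E_{e^+} + E_{e^-})^2 - \left|\vec{p}_{e^+}c + \vec{p}_{e^-}c\right|^2 \approx 2\,E_{e^+}E_{e^-}\,(1 - \cos\theta),$$

where θ is the opening angle between the two spectrometer arms and the approximation holds for ultra-relativistic leptons. A genuine parent particle shows up as a peak in this quantity – at about 3.1 GeV in this case – whatever the individual electron and positron energies happen to be.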

The team went through extraordinary contortions to check their apparatus to be sure that nothing was biasing their results. The particle decaying into the electron and positron they were measuring was a difficult one to swallow. The energy region had been scoured before, even if not so thoroughly, without anything being seen. Also the resonance was looking “narrow” – this means that the energy sums were coming out at 3.1 GeV with great precision rather than, for example, spanning from 2.9 to 3.3 GeV. The width is a measure of the stability of the particle (from Heisenberg’s Uncertainty Principle, which requires only that the product of the average lifetime and the width be a constant). A narrow width means that the particle lives a long time. No other particle of such a heavy mass (over three times the mass of the proton) has anything like that stability.
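To put rough numbers on this – an illustrative estimate, not a quotation from the article – the uncertainty principle relates the width Γ to the mean lifetime τ by

$$\tau \approx \frac{\hbar}{\Gamma}, \qquad \hbar \approx 6.6 \times 10^{-22}\ \mathrm{MeV\,s},$$

so a width of order 100 keV, roughly what the new particle turned out to have, corresponds to τ ∼ 10⁻²⁰ s, whereas a typical hadronic resonance of comparable mass, with a width of order 100 MeV, survives only ∼10⁻²³ s.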

By the end of October, the team had about 500 events from a 3.1 GeV particle. They were keen to extend their search to the maximum mass their detection system could pin down (about 5.5 GeV) but were prodded into print mid-November by dramatic news from the other coast of America. They baptised the particle J, which is a letter close to the Chinese symbol for “ting”. From then on, the experiment has had top priority. Sam Ting said that the Director of the Laboratory, George Vineyard, asked him how much time on the machine he would need – which is not the way such conversations usually go.

The apparition of the particle at the Stanford Linear Accelerator Center on 10 November was nothing short of shattering. Burt Richter described it as “the most exciting and frantic week-end in particle physics I have ever been through”. It followed an upgrading of the electron–positron storage ring SPEAR during the late Summer.

Until June, SPEAR was operating with beams of energy up to 2.5 GeV so that the total energy in the collision was up to a peak of 5 GeV. The ring was shut down during the late summer to install a new RF system and new power supplies so as to reach about 4.5 GeV per beam. It was switched on again in September and within two days beams were orbiting the storage ring again. Only three of the four new RF cavities were in action so the beams could only be taken to 3.8 GeV. Within two weeks the luminosity had climbed to 5 × 10³⁰ cm⁻² s⁻¹ (the luminosity dictates the number of interactions the physicists can see) and time began to be allocated to experimental teams to bring their detection systems into trim.

It was the Berkeley/Stanford team led by Richter, M Perl, W Chinowsky, G Goldhaber and G H Trilling who went into action during the week-end 9–10 November to check back on some “funny” readings they had seen in June. They were using a detection system consisting of a large solenoid magnet, wire chambers, scintillation counters and shower counters, almost completely surrounding one of the two intersection regions where the electrons and positrons are brought into head-on collision.

Put through its paces

During the first series of measurements with SPEAR, when it went through its energy paces, the cross-section (or probability of an interaction between an electron and positron occurring) was a little high at 1.6 GeV beam energy (3.2 GeV collision energy) compared with the neighbouring beam energies. The June exercise, which gave the funny readings, was a look over this energy region again. Cross-sections were measured with electrons and positrons at 1.5, 1.55, 1.6 and 1.65 GeV. Again 1.6 GeV was a little high but 1.55 GeV was even more peculiar. In eight runs, six measurements agreed with the 1.5 GeV data while two were higher (one of them five-times higher). So, obviously, a gremlin had crept into the apparatus. While meditating during the transformation from SPEAR I to SPEAR II, the gremlin was looked for but not found. It was then that the suspicion grew that between 3.1 and 3.2 GeV collision energies could lie a resonance.

During the night of 9–10 November the hunt began, changing the beam energies in 0.5 MeV steps. By 11.00 a.m. Sunday morning the new particle had been unequivocally found. A set of cross-section measurements around 3.1 GeV showed that the probability of interaction jumped by a factor of 10 from 20 to 200 nanobarns. In a state of euphoria, the champagne was cracked open and the team began celebrating an important discovery. Gerson Goldhaber retired in search of peace and quiet to write the findings for immediate publication.

The detection system at the SPEAR storage ring at Stanford

While he was away, it was decided to polish up the data by going slowly over the resonance again. The beams were nudged from 1.55 to 1.57 and everything went crazy. The interaction probability soared higher; from around 20 nanobarns the cross-section jumped to 2000 nanobarns and the detector was flooded with events producing hadrons. Pief Panofsky, the Director of SLAC, arrived and paced around invoking the Deity in utter amazement at what was being seen. Gerson Goldhaber then emerged with his paper proudly announcing the 200 nanobarn resonance and had to start again, writing 10 times more proudly.

Within hours of the SPEAR measurements, the telephone wires across the Atlantic were humming as information enquiries and rumours were exchanged. As soon as it became clear what had happened, the European Laboratories looked to see how they could contribute to the excitement. The obvious candidates, to be in on the act quickly, were the electron–positron storage rings at Frascati and DESY.

From 13 November, the experimental teams on the ADONE storage ring (from Frascati and the INFN sections of the universities of Naples, Padua, Pisa and Rome) began to search in the same energy region. They have detection systems for three experiments known as gamma–gamma (wide solid angle detector with high efficiency for detecting neutral particles), MEA (solenoidal magnetic spectrometer with wide gap spark chambers and shower detectors) and baryon–antibaryon (coaxial hodoscopes of scintillators covering a wide solid angle). The ADONE operators were able to jack the beam energy up a little above its normal peak of 1.5 GeV and on 15 November the new particle was seen in all three detection systems. The data confirmed the mass and the high stability. The experiments are continuing using the complementary abilities of the detectors to gather as much information as possible on the nature of the particle.

At DESY, the DORIS storage ring was brought into action with the PLUTO and DASP detection systems described later in this issue on page 427. During the week-end of 23–24 November, a clear signal at about 3.1 GeV total energy was seen in both detectors, with PLUTO measuring events with many emerging hadrons and DASP measuring two emerging particles. The angular distribution of elastic electron–positron scattering was measured at 3.1 GeV, and around it, and a distinct change was seen. The detectors are now concentrating on measuring branching ratios – the relative rate at which the particle decays in different ways.

Excitation times

In the meantime, SPEAR II had struck again. On 21 November, another particle was seen at 3.7 GeV. Like the first it is a very narrow resonance indicating the same high stability. The Berkeley/Stanford team have called the particles psi (3105) and psi (3695).

No-one had written the recipe for these particles and that is part of what all the excitement is about. At this stage, we can only speculate about what they might mean.  First of all, for the past year, something has been expected in the hadron–lepton relationship. The leptons are particles, like the electron, which we believe do not feel the strong force. Their interactions, such as are initiated in an electron–positron storage ring, can produce hadrons (or strong force particles) via their common electromagnetic features. On the basis of the theory that hadrons are built up of quarks (a theory that has a growing weight of experimental support – see CERN Courier October 1974 pp331–333), it is possible to calculate relative rates at which the electron–positron interaction will yield hadrons and the rate should decrease as the energy goes higher. The results from the Cambridge bypass and SPEAR about a year ago showed hadrons being produced much more profusely than these predictions.
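The quark-model expectation referred to here is usually expressed through the ratio R – a standard result spelled out for clarity, not taken from the 1974 text:

$$R \equiv \frac{\sigma(e^+e^- \to \mathrm{hadrons})}{\sigma(e^+e^- \to \mu^+\mu^-)} = 3\sum_q Q_q^2, \qquad \sigma(e^+e^- \to \mu^+\mu^-) = \frac{4\pi\alpha^2}{3s}.$$

With the three quark flavours then known, R = 3(4/9 + 1/9 + 1/9) = 2, rising to 10/3 if a charmed quark is added; and since both cross-sections fall as 1/s, the hadron yield was expected to drop with energy – which is precisely what the data were refusing to do.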

What seems to be the inverse of this observation is seen at the CERN Intersecting Storage Rings and the 400 GeV synchrotron at the FermiLab. In interactions between hadrons, such as proton–proton collisions, leptons are seen coming off at much higher relative rates than could be predicted. Are the new particles behind this hadron–lepton mystery? And if so, how?

Signs of a revolution

Other speculations are that the particles have new properties to add to the familiar ones like charge, spin, parity… As the complexity of particle behaviour has been uncovered, names have had to be selected to describe different aspects. These names are linked, in the mathematical description of what is going on, to quantum numbers. When particles interact, the quantum numbers are generally conserved – the properties of the particles going into the interaction are carried away, in some perhaps very different combination, by the particles which emerge. If there are new properties, they also will influence what interactions can take place.

To explain what might be happening, we can consider the property called “strangeness”. This was assigned to particles like the neutral kaon and lambda to explain why they were always produced in pairs – the strangeness quantum number is then conserved, the kaon carrying +1, the lambda carrying –1. It is because the kaon has strangeness that it is a very stable particle. It will not readily break up into other particles which do not have this property.

They baptised the particle J, which is a letter close to the Chinese symbol for “ting”

Two new properties have recently been invoked by the theorists – colour and charm. Colour is a suggested property of quarks which makes sense of the statistics used to calculate the consequences of their existence. This gives us nine basic quarks – three coloured varieties of each of the three familiar ones. Charm is a suggested property which makes sense of some observations concerning neutral current interactions (discussed below).

It is the remarkable stability of the new particles which makes it so attractive to invoke colour or charm. From the measured width of the resonances they seem to live for about 10⁻²⁰ seconds and do not decay rapidly like all the other resonances in their mass range. Perhaps they carry a new quantum number?

Unfortunately, even if the new particles are coloured, since they are formed electromagnetically they should be able to decay the same way and the sums do not give their high stability. In addition, the sums say that there is not enough energy around for them to be built up of charmed constituents. The answer may lie in new properties but not in a way that we can easily calculate.

Yet another possibility is that we are, at last, seeing the intermediate boson. This particle was proposed many years ago as an intermediary of the weak force. Just as the strong force is communicated between hadrons by passing mesons around and the electromagnetic force is communicated between charged particles by passing photons around, it is thought that the weak force could also act via the exchange of a particle rather than “at a point”.

Perhaps the new particles carry a new quantum number?

When it was believed that the weak interactions always involved a change of electric charge between the lepton going into the interaction and the lepton going out, the intermediate boson (often referred to as the W particle) was always envisaged as a charged particle. The CERN discovery of neutral currents in 1973 revealed that a charge change between the leptons need not take place; there could also be a neutral version of the intermediate boson (often referred to as the Z particle). The Z particle can also be treated in the theory which has had encouraging success in uniting the interpretations of the weak and electromagnetic forces.

This work has taken the Z mass into the 70 GeV region and its appearance around 3 GeV would damage some of the beautiful features of the unification theories. A strong clue could come from looking for asymmetries in the decays of the new particles because, if they are of the Z variety, parity violation should occur.

1974 has been one of the most fascinating years ever experienced in high-energy physics. Still reeling from the neutral current discovery, the year began with the SPEAR hadron production mystery, continued with new high-energy information from the FermiLab and the CERN ISR, including the high lepton production rate, and finished with the discovery of the new particles. And all this against a background of feverish theoretical activity trying to keep pace with what the new accelerators and storage rings have been uncovering.

The post The new particles appeared first on CERN Courier.

Exploding misconceptions https://cerncourier.com/a/exploding-misconceptions/ Wed, 13 Nov 2024 09:47:45 +0000 https://cern-courier.web.cern.ch/?p=111451 Cosmologist Katie Mack talks to the Courier about how high-energy physics can succeed in #scicomm by throwing open the doors to academia.

Katie Mack

What role does science communication play in your academic career?

When I was a postdoc I started to realise that the science communication side of my life was really important to me. It felt like I was having a big impact – and in research, you don’t always feel like you’re having that big impact. When you’re a grad student or postdoc, you spend a lot of time dealing with rejection, feeling like you’re not making progress or you’re not good enough. I realised that with science communication, I was able to really feel like I did know something, and I was able to share that with people.

When I began to apply for faculty jobs, I realised I didn’t want to just do science writing as a nights and weekends job, I wanted it to be integrated into my career. Partially because I didn’t want to give up the opportunity to have that kind of impact, but also because I really enjoyed it. It was energising for me and helped me contextualise the work I was doing as a scientist.

How did you begin your career in science communication?

I’ve always enjoyed writing stories and poetry. At some point I figured out that I could write about science. When I went to grad school I took a class on science journalism and the professor helped me pitch some stories to magazines, and I started to do freelance science writing. Then I discovered Twitter. That was even better because I could share every little idea I had with a big audience. Between Twitter and freelance science writing, I garnered quite a large profile in science communication and that led to opportunities to speak and do more writing. At some point I was approached by agents and publishers about writing books.

Who is your audience?

When I’m not talking to other scientists, my main community is generally those who have a high-school education, but not necessarily a university education. I don’t tailor things to people who aren’t interested in science, or try to change people’s minds on whether science is a good idea. I try to help people who don’t have a science background feel empowered to learn about science. I think there are a lot of people who don’t see themselves as “science people”. I think that’s a silly concept but a lot of people conceptualise it that way. They feel like science is closed to them.

The more that science communicators can give people a moment of understanding, an insight into science, I think they can really help people get more involved in science. The best feedback I’ve ever gotten is when students have come up to me and said “I started studying physics because I followed you on Twitter and I saw that I could do this,” or they read my book and that inspired them. That’s absolutely the best thing that comes out of this. It is possible to have a big impact on individuals by doing social media and science communication – and hopefully change the situation in science itself over time.

What were your own preconceptions of academia?

I have been excited about science since I was a little kid. I saw that Stephen Hawking was called a cosmologist, so I decided I wanted to be a cosmologist too. I had this vision in my head that I would be a theoretical physicist. I thought that involved a lot of standing alone in a small room with a blackboard, writing equations and having eureka moments. That’s what was always depicted on TV: you just sit by yourself and think real hard. When I actually got into academia, I was surprised by how collaborative and social it is. That was probably the biggest difference between expectation and reality.

How do you communicate the challenges of academia, alongside the awe-inspiring discoveries and eureka moments?

I think it’s important to talk about what it’s really like to be an academic, in both good ways and bad. Most people outside of academia have no idea what we do, so it’s really valuable to share our experiences, both because it challenges stereotypes in terms of what we’re really motivated by and how we spend our time, but also because there are a lot of people who have the same impression I did: where you just sit alone in a room with a chalkboard. I believe it’s important to be clear about what you actually do in academia, so more people can see themselves happy in the job.

At the same time, there are challenges. Academia is hard and can be very isolating. My advice for early-career researchers is to have things other than science in your life. As a student you’re working on something that potentially no one else cares very much about, except maybe your supervisor. You’re going to be the world-expert on it for a while. It can be hard to go through that and not have anybody to talk to about your work. I think it’s important to acknowledge what people go through and encourage them to get support.

Theoretical physicist Katie Mack

There are of course other parts of academia that can be really challenging, like moving all the time. I went from West coast to East coast between undergrad and grad school, and then from the US to the UK, from the UK to Australia, back to the US and then to Canada. That’s a lot. It’s hard. They’re all big moves so you lose whatever local support system you had and you have to start over in a new place, make new friends and get used to a whole new government bureaucracy.

So there are a whole lot of things that are difficult about academia, and you do need to acknowledge those because a lot of them affect equity. Some of these make it more challenging to have diversity in the field, and they disproportionately affect some groups more than others. It is important to talk about these issues instead of just sweeping people under the rug.

Do you think that social media can help to diversify science and research?

Yes! I think that a large reason why people from underrepresented groups leave science is because they lack the feeling of belonging. If you get into a field and don’t feel like you belong, it’s hard to power through that. It makes it very unpleasant to be there. So I think that one of the ways social media can really help is by letting people see scientists who are not the stereotypical old white men. Talking about what being a scientist is really like, what the lifestyle is like, is really helpful for dismantling those stereotypes.

Your first book, The End of Everything, explored astrophysics but your next will popularise particle physics. Have you had to change your strategy when communicating different subjects?

This book is definitely a lot harder to write. The first one was very big and dramatic: the universe is ending! In this one, I’m really trying to get deeper into how fundamental physics works, which is a more challenging story to tell. The way I’m framing it is through “how to build a universe”. It’s about how fundamental physics connects with the structure of reality, both in terms of what we experience in our daily lives, but also the structure of the universe, and how physicists are working to understand that. I also want to highlight some of the scientists who are doing that work.

So yes, it’s much harder to find a catchy hook, but I think the subject matter and topics are things that people are curious about and have a hunger to understand. There really is a desire amongst the public to understand what the point of studying particle physics is.

Is high-energy physics succeeding when it comes to communicating with the public?

I think that there are some aspects where high-energy physics does a fantastic job. When the Higgs boson was discovered in 2012, it was all over the news and everybody was talking about it. Even though it’s a really tough concept to explain, a lot of people got some inkling of its importance.

A lot of science communication in high-energy physics relies on big discoveries; however, recently there have not been that many discoveries at the level of international news. There have been many interesting anomalies in recent years, but in terms of discoveries we had the Higgs in 2012 and the neutrino mass in 1998, and I’m not sure that there are many others that would really grab your attention if you’re not already invested in physics.

Part of the challenge is just the phase of discovery that particle physics is in right now. We have a model, and we’re trying to find the edges of validity of that model. We see some anomalies and then we fix them, and some might stick around. We have some ideas and theories but they might not pan out. That’s kind of the story we’re working with right now, whereas if you’re looking at astronomy, we had gravitational waves and dark energy. We get new telescopes with beautiful pictures all the time, so it’s easier to communicate and get people excited than it is in particle physics, where we’re constantly refining the model and learning new things. It’s a fantastically exciting time, but there have been no big paradigm shifts recently.

How can you keep people engaged in a subject where big discoveries aren’t constantly being made?

I think it’s hard. There are a few ways to go about it. You can talk about the really massive journey we’re on: this hugely consequential and difficult challenge we’re facing in high-energy physics. It’s a huge task of massive global effort, so you can help people feel involved in the quest to go beyond the Standard Model of particle physics.

You need to acknowledge it’s going to be a long journey before we make any big discoveries. There’s much work to be done, and we’re learning lots of amazing things along the way. We’re getting much higher precision. The process of discovery is also hugely consequential outside of high-energy physics: there are so many technological spin-offs that tie into other fields, like cosmology. Discoveries are being made between particle and cosmological physics that are really exciting.

Every little milestone is an achievement to be celebrated

We don’t know what the end of the story looks like. There aren’t a lot of big signposts along the way where we can say “we’ve made so much progress, we’re halfway there!” Highlighting the purpose of discovery, the little exciting things that we accomplish along the way such as new experimental achievements, and the people who are involved and what they’re excited about – this is how we can get around this communication challenge.

Every little milestone is an achievement to be celebrated. CERN is the biggest laboratory in the world. It’s one of humanity’s crowning achievements in terms of technology and international collaboration – I don’t think that’s an exaggeration. CERN and the International Space Station. Those two labs are examples of where a bunch of different countries, which may or may not get along, collaborate to achieve something that they can’t do alone. Seeing how everyone works together on these projects is really inspiring. If more people were able to get a glimpse of the excitement and enthusiasm around these experiments, it would make a big difference.

The post Exploding misconceptions appeared first on CERN Courier.

Back to the future https://cerncourier.com/a/back-to-the-future/ Mon, 16 Sep 2024 14:11:58 +0000 https://preview-courier.web.cern.ch/?p=111034 A photographic journey linking CERN’s early years to ongoing research.

The past seven decades have seen remarkable cultural and technological changes. And CERN has been no passive observer. From modelling European cooperation in the aftermath of World War II to democratising information via the web and discovering a field that pervades the universe, CERN has nudged the zeitgeist more than once since its foundation in 1954.

It’s undeniable, though, that much has stayed the same. A high-energy physics lab still needs to be fast, cool, collaborative, precise, practically useful, deep, diplomatic, creative and crystal clear. Plus ça change, plus c’est la même chose.

This selection of (lightly colourised) snapshots from CERN’s first 25 years, accompanied by expert reflections from across the lab, show how things have changed in the intervening years – and what has stayed the same.

1960

A 5 m diameter magnetic storage ring in 1960

The discovery that electrons and muons possess spin that precesses in a magnetic field has inspired generations of experimentalists and theorists to push the boundaries of precision. The key insight is that quantum effects modify the magnetic moment associated with the particles’ spins, making their gyromagnetic ratios (g) slightly larger than two, the value predicted by Dirac’s equation. For electrons, these quantum effects are primarily due to the electromagnetic force. For muons, the weak and strong forces also contribute measurably – as well, perhaps, as unknown forces. These measurements stand with the most beautiful and precise of all time, and their history is deeply intertwined with that of the Standard Model.

CERN physicists Francis Farley and Emilio Picasso were pioneers and driving forces behind the muon g–2 experimental programme. The second CERN experiment introduced the use of a 5 m diameter magnetic storage ring. Positive muons with 1.3 GeV momentum travelled around the ring until they decayed into positrons whose directions were correlated with the spin of the parent muons. The experiment tested the muon’s anomalous magnetic moment (g-2) with a precision of 270 parts per million. A brilliant concept, the “magic gamma”, was then introduced in the third CERN experiment in the late 1970s: by using muons at a momentum of 3.1 GeV, the effect of electric fields on the precession frequency cancelled out, eliminating a major source of systematic error. All subsequent experiments have relied on this principle, with the exception of an experiment using ultra-cold muons that is currently under construction in Japan. A friendly rivalry for precision between experimentalists and theorists continues today (Lattice calculations start to clarify muon g-2), with the latest measurement at Fermilab achieving a precision of 190 parts per billion.
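The “magic” condition can be made explicit with a textbook form of the spin-precession formula (not spelled out in the caption above, and written here up to sign conventions). In a ring with magnetic field B and electric field E, the anomalous precession frequency is

$$\vec{\omega}_a = \frac{e}{m_\mu}\left[\,a_\mu \vec{B} - \left(a_\mu - \frac{1}{\gamma^2 - 1}\right)\frac{\vec{\beta} \times \vec{E}}{c}\,\right],$$

so the electric-field term vanishes when γ = √(1 + 1/a_μ) ≈ 29.3, which for the muon corresponds to a momentum of about 3.09 GeV/c – the 3.1 GeV quoted above.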

Andreas Hoecker is spokesperson for the ATLAS collaboration.

1961

Inspecting bubble-chamber images by hand

The excitement of discovering new fundamental particles and forces made the 1950s and 1960s a golden era for particle physicists. A lot of creative energy was channelled into making new particle detectors, such as the liquid hydrogen (or heavier liquid) bubble chambers that paved the way to discoveries such as neutral currents, and seminal studies of neutrinos and strange and charmed baryons. As particles pass through, they make the liquid boil, producing bubbles that are captured to form images. In 1961, each had to be painstakingly inspected by hand, as depicted here, to determine the properties of each particle. Fortunately, in the decades since, physicists have found ways to preserve the level of detail they offer and build on this inspiration to prepare new technologies. Liquid–argon time-projection chambers such as CERN’s DUNE prototypes, which are currently the largest of their kind in the world, effectively give us access to bubble-chamber images in full colour, with the colour representing energy deposition (CERN Courier July/August 2024 p41). Millions of these images are now analysed algorithmically – essential, as DUNE is expected to generate one of the highest data rates in the world.

Laura Munteanu is a CERN staff scientist working on the T2K and DUNE experiments.

1965

The first experiment at CERN to use a superconducting magnet, in 1965

This photograph shows the first experiment at CERN to use a superconducting magnet. The pictured physicist is adjusting a cryostat containing a stack of nuclear emulsions surrounded by a liquid–helium-cooled superconducting niobium–zirconium electromagnet. A pion beam from CERN’s synchrocyclotron passes through the quadrupole magnet at the right, collimated by the pile of lead bricks and detected by a small plastic scintillation counter before entering the cryostat. In this study of double charge exchange from π⁺ to π⁻ in nuclear emulsions, the experiment consumed between one and two litres of liquid helium per hour from the container in the left foreground, with the vapour being collected for reuse (CERN Courier August 1965 p116).

Today, the LHC is the world’s largest scientific instrument, with more than 24 km of the machine operating at 1.9 K – and yet only one project among many at CERN requiring advanced cryogenics. As presented at the latest international cryogenic engineering conference organised here in July, there have never been so many cryogenics projects either implemented or foreseen. They include accelerators for basic research, light sources, medical accelerators, detectors, energy production and transmission, trains, planes, rockets and ships. The need for energy efficiency and long-term sustainability will necessitate cryogenic technology with an enlarged temperature range for decades to come. CERN’s experience provides a solid foundation for a new generation of engineers to contribute to society.

Serge Claudet is a former deputy group leader of CERN’s cryogenics group.

1966

Mirrors at CERN used to reflect Cherenkov light

Polishing a mirror at CERN in 1966. Are physicists that narcissistic? Perhaps some are, but not in this case. Ultra-polished mirrors are still a crucial part of a class of particle detectors based on the Cherenkov effect. Just as a shock wave of sound is created when an object flies through the sky at a speed greater than the speed of sound in air, so charged particles create a shock wave of light when they pass through a medium at a speed greater than the speed of light in that medium. This effect is extremely useful for measuring the velocity of a charged particle, because the emission angle of light packets relative to the trajectory of the particle is related to the velocity of the particle itself. By measuring the emission angle of Cherenkov light for an ultra-relativistic charged particle travelling through a transparent medium, such as a gas, the velocity of the particle can be determined. Together with the measurement of the particle’s momentum, it is then possible to obtain its identity card, i.e. its mass. Mirrors are used to reflect Cherenkov light to the photosensors. The LHCb experiment at CERN has the most advanced Cherenkov detector ever built. Years go by and technology evolves, but fundamental physics is about reality, and that’s unchangeable!
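For readers who want the relation behind this – the standard optics of the Cherenkov effect, added here for clarity – the light is emitted at an angle θ_c to the track given by

$$\cos\theta_c = \frac{1}{n\beta}, \qquad \text{with emission only for } \beta > \frac{1}{n},$$

so measuring θ_c in a radiator of known refractive index n fixes the velocity β; combined with the momentum p from the tracking system, the mass follows from mc = p√(1/β² − 1).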

Vincenzo Vagnoni is spokesperson of the LHCb collaboration.

1970

The Intersecting Storage Rings

In 1911, Heike Kamerlingh Onnes made a groundbreaking discovery by measuring zero resistance in a mercury wire at 4.2 K, revealing the phenomenon of superconductivity. This earned him the 1913 Nobel Prize, decades in advance of Bardeen, Cooper and Schrieffer’s full theoretical explanation of 1957. It wasn’t until the 1960s that the first superconducting magnets exceeding 1 T were built. This delay stemmed from the difficulty in enabling bulk superconductors to carry large currents in strong magnetic fields – a challenge requiring significant research.

The world’s first proton–proton collider, CERN’s pioneering Intersecting Storage Rings (ISR, pictured below left), began operation in 1971, a year after this photograph was taken. One of its characteristic “X”-shaped vacuum chambers is visible, flanked by combined-function bending magnets on either side. In 1980, to boost its luminosity, eight superconducting quadrupole magnets based on niobium-titanium alloy were installed, each with a 173 mm bore and a peak field of 5.8 T, making the ISR the first collider to use superconducting magnets. Today, we continue to advance superconductivity. For the LHC’s high-luminosity upgrade, we are preparing to install the first magnets based on niobium-tin technology: 24 quadrupoles with a 150 mm aperture and a peak field of 11.3 T.

Susana Izquierdo Bermudez leads CERN’s Large Magnet Facility.

1972

Mary Gaillard and Murray Gell-Mann

The Theoretical Physics Department, or Theory Division as it used to be known, dates back to the foundation of CERN, when it was first established in Copenhagen under the direction of Niels Bohr, before moving to Geneva in 1957. Theory flourished at CERN in the 1960s, hosting many scientists from CERN’s member states and beyond, working side-by-side with experimentalists with a particular focus on strong interactions.

In 1972, when Murray Gell-Mann visited CERN and had this discussion with Mary Gaillard, the world of particle physics was at a turning point. The quark model had been proposed by Gell-Mann in 1964 (similar ideas had been proposed by George Zweig and André Petermann) and the first experimental evidence for the reality of quarks had emerged from deep-inelastic electron scattering at SLAC in 1968. However, the dynamics of quarks was a puzzle. The weak interactions being discussed by Gaillard and Gell-Mann in this picture were also puzzling, though Gerard ’t Hooft and Martinus Veltman had just shown that the unified theory of weak and electromagnetic interactions proposed earlier by Shelly Glashow, Abdus Salam and Steven Weinberg was a calculable theory.

The first evidence for this theory came in 1973 with the discovery of neutral currents by the Gargamelle neutrino experiment at CERN, and 1974 brought the discovery of the charm quark, a key ingredient in what came to be known as the Standard Model. This quark had been postulated to explain properties of K mesons, whose decays are being discussed by Gaillard and Gell-Mann in this picture, and Gaillard, together with Benjamin Lee, went on to play a key role in predicting its properties. The discoveries of neutral currents and charm ushered in the Standard Model, and CERN theorists were active in exploring its implications – notably in sketching out the phenomenology of the Brout–Englert–Higgs mechanism. We worked with experimentalists particularly closely during the 1990s, making precise calculations and interpreting the results emerging from LEP that established the Standard Model.

CERN Theory in the 21st century has largely been focused on the LHC experimental programme and pursuing new ideas for physics beyond the Standard Model, often in relation to cosmology and astrophysics. These are likely to be the principal themes of theoretical research at CERN during its eighth decade.

John Ellis served as head of CERN’s Theoretical Physics Department from 1988 to 1994.

1974

Adjusting the electronics of the ion source

From 1959 to 1992, Linac1 accelerated protons to 50 MeV, for injection into the Proton Synchrotron, and from 1972 into the Proton Synchrotron Booster. In 1974, their journey started in this ion source. High voltage was used to achieve the first acceleration to a few percent of the speed of light. It wasn’t only the source itself that had to be at high voltage, but also the power supplies that fed the magnets, the controllers for gas injection, the diagnostics and the controls. This platform was the laboratory for the ion source. When operational, the cubicle and everything in it was at 520 kV, meaning all external surfaces had to be smooth to avoid sparks. As pictured, hydraulic jacks could lift the lid to allow access for maintenance and testing, at which point a drawbridge would be lowered from the adjacent wall to allow the engineers and technicians to take a seat in front of the instruments.
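As a rough cross-check of that “few percent of the speed of light”, a non-relativistic estimate suffices, since 520 keV is tiny compared with the proton rest energy of 938 MeV. A minimal back-of-the-envelope sketch in Python (the only inputs are the numbers quoted above):

PROTON_REST_ENERGY_MEV = 938.272   # proton rest energy, m*c^2
kinetic_energy_mev = 0.520         # 520 keV gained on the high-voltage platform

# Non-relativistic kinetic energy: E_k = (1/2) m v^2, so v/c = sqrt(2 E_k / (m c^2))
beta = (2 * kinetic_energy_mev / PROTON_REST_ENERGY_MEV) ** 0.5
print(f"v/c is about {beta:.3f}")  # roughly 0.033, i.e. about 3% of the speed of light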

Thanks to the invention of radio-frequency quadrupoles by Kapchinsky and Teplyakov, radio-frequency acceleration can now start from lower proton energies. Today, ion sources use much lower voltages, in the range of tens of kilovolts, allowing the source installations to shrink dramatically in size compared to the 1970s.

Richard Scrivens is CERN’s deputy head of accelerator and beam physics.

1974

The Super Proton Synchrotron tunnel

CERN’s labyrinth of tunnels has been almost continuously expanding since the lab was founded 70 years ago. When CERN was first conceived, who would have thought that the 7 km-long Super Proton Synchrotron tunnel shown in this photograph would have been constructed, let alone the 27 km LEP/LHC tunnel? Questions similar to those being posed today about the proposed Future Circular Collider (FCC) tunnel were once raised about the feasibility of the LEP tunnel. But if you take a step back and look at the history of CERN’s expanding tunnel network, it seems like the next logical step for the organisation.

This vintage SPS photograph from the 1970s shows the tunnel’s secondary lining being constructed. The concrete was transported from the surface down the 50 m-deep shafts and then pumped behind the metal formwork to create the tunnel walls. This technology is still used today, most recently for the HL-LHC tunnels. However, for a mega-project like the FCC, a much quicker and more sophisticated methodology is envisaged. The tunnels would be excavated using tunnel boring machines, which would install a pre-cast concrete segmental lining using robotics immediately after the excavation of the rock, allowing 20 m of tunnel to be excavated and lined with concrete per day.

John Osborne is a senior civil engineer at CERN.

1977

Alan Jeavons and David Townsend

Detector development for fundamental physics always advances in symbiosis with detector development for societal applications. Here, Alan Jeavons (left) and David Townsend prepare the first positron-emission tomography (PET) scan of a mouse to be performed at CERN. A pair of high-density avalanche chambers (HIDACs) can be seen above and below Jeavons’ left hand. As in PET scans in hospitals today, a radioactive isotope introduced into the biological tissue of the mouse decays by emitting a positron that travels a few millimetres before annihilating with an electron. The resulting pair of coincident and back-to-back 511 keV photons was then converted into electron avalanches which were reconstructed in multiwire proportional chambers – a technology invented by CERN physicist Georges Charpak less than a decade earlier to improve upon bubble chambers and cloud chambers in high-energy physics experiments. The HIDAC detector later contributed to the development of three-dimensional PET image reconstruction. Such testing now takes place at dedicated pre-clinical facilities.

Today, PET detectors are based on inorganic scintillating crystals coupled to photodetectors – a technology that is also used in the CMS and ALICE experiments at the LHC. CERN’s Crystal Clear collaboration has been continuously developing this technology since 1991, yielding benefits for both fundamental physics and medicine.

One of the current challenges in PET is to improve time resolution in time-of-flight PET (TOF-PET) below 100 ps, and towards 10 ps. This will eventually enable positron annihilations to be pinpointed at the millimetre level, improving image quality, speeding up scans and reducing the dose injected into patients. Improvements in time resolution are also important for detectors in future high-energy experiments, and the future barrel timing layer of the CMS detector upgrade for the High-Luminosity LHC was inspired by TOF-PET R&D.
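The connection between timing and localisation quoted above is just the arithmetic of the coincidence: the annihilation point along the line joining the two detected photons is constrained to Δx = cΔt/2. A minimal sketch of that arithmetic (the two time resolutions are the targets mentioned in the text):

SPEED_OF_LIGHT = 299_792_458  # m/s

for dt_ps in (100, 10):
    dx_mm = SPEED_OF_LIGHT * (dt_ps * 1e-12) / 2 * 1e3  # delta_x = c * delta_t / 2
    print(f"{dt_ps:3d} ps coincidence timing -> about {dx_mm:.1f} mm along the line of response")
# 100 ps corresponds to about 15 mm; 10 ps approaches the millimetre level.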

Etiennette Auffray Hillemanns is spokesperson for the Crystal Clear collaboration and technical coordinator for the CMS electromagnetic calorimeter.

1979

Rafel Carreras

In this photo, we see Rafel Carreras, a remarkable science educator and communicator, sharing his passion for science with an eager audience of young learners. Known for his creativity and enthusiasm, Carreras makes the complex world of particle physics accessible and fun. His particle-physics textbook When Energy Becomes Matter includes memorable visualisations that we still use in our education activities today. One such visualisation is the “fruity strawberry collision”, wherein two strawberries collide and transform into a multitude of new fruits, illustrating how particle collisions produce a shower of new particles that didn’t exist before.

Today, we find fewer chalk boards at CERN and more casual clothing, but one thing remains the same: CERN’s dedication to education and communication. Over the years, CERN has trained more than 10,000 science teachers, significantly impacting science education globally. CERN Science Gateway, our new education and outreach centre, allows us to welcome about 400,000 visitors annually. It offers a wide range of activities, such as interactive exhibitions, science shows, guided tours and hands-on lab experiences, making science exciting and accessible for everyone. Thanks to hundreds of passionate and motivated guides, visitors leave inspired and curious to find out more about the fascinating scientific endeavours and extraordinary technologies at CERN.

Julia Woithe coordinates educational activities at CERN’s new Science Gateway.

  • These photographs are part of a collection curated by Renilde Vanden Broeck, which will be exhibited at CERN in September.

Wonderstruck wanderings https://cerncourier.com/a/wonderstruck-wanderings/ Mon, 16 Sep 2024 09:00:01 +0000 https://preview-courier.web.cern.ch/?p=111204 The wonder and awe that we sense when we look at the starry skies is a major motivation to do science. Both Plato (Theaetetus 155d) and Aristotle (Metaphysics 982b12) wrote that philosophy starts in wonder. Plato went even further to declare that the eye’s primary purpose is none other than to see and study the […]

An illustration of a flea from Robert Hooke’s Micrographia

The wonder and awe that we sense when we look at the starry skies is a major motivation to do science. Both Plato (Theaetetus 155d) and Aristotle (Metaphysics 982b12) wrote that philosophy starts in wonder. Plato went even further to declare that the eye’s primary purpose is none other than to see and study the stars (Timaeus 47c). But wonder and awe also play a wider role beyond science, and are fundamental to other endeavours of human civilisation, such as religion. In Wonderstruck: How Wonder and Awe Shape the Way We Think, Helen De Cruz (Saint Louis University) traces the relationship between wonder and awe and philosophy, religion, magic and science, and the development of these concepts throughout history.

Essential emotion

De Cruz’s book is rich in content, drawing from psychology, anthropology and literature. Aptly for particle physicists, she points out that it is not only the very largest scales that fill us with awe, but also the very smallest, as for example in Robert Hooke’s Micrographia, the first book to include illustrations of insects and plants as seen through a microscope. Everyday things may be sources of wonder, according to philosopher and rabbi Abraham J Heschel, who has written on religion as a response to the awe that we feel when we look at the cosmos. Even hard-nosed economists recognise the fundamental role of wonder, she observes: Adam Smith, the famous economist who wrote The Wealth of Nations, believed that wonder is an essential emotion that underlies the pursuit of science, as it prompts people to explore the unknown and seek knowledge about the world. Although particle physics is not mentioned explicitly in the book – the closest instance is a quote from Feynman’s Lectures on Physics – the implications are clear. And while the sources quoted are mostly Western, other traditions are not ignored, with references to Chinese and Japanese culture present, among others.

Wonderstruck

The book also motivates questions that it does not address, some of which are especially interesting for fundamental physics. For example, modern human beings who live and work in cities spend most of their lives in an environment that alienates them from nature, and nature-induced awe must compete with technology-driven amazement. One can perhaps glimpse this in outreach, where curiosity about technology sometimes, though not always, eclipses interest in the fundamental questions of science. While the book discusses this topic in the context of climate change – a reality that reminds us that we cannot ignore nature – there is more to be said about how such an attitude affects the motivation for fundamental science.

At a time when large scientific projects, such as CERN’s proposed Future Circular Collider, are being considered, generating a lot of discussions about cost and benefit, this book reminds us that the major motivation of a new telescope or collider is to push into the frontiers of the unknown – a process that starts and finishes with wonder and awe. As such, the book is very useful reading for scientists doing fundamental research, especially those who engage with the public.

Advances in cosmology https://cerncourier.com/a/advances-in-cosmology/ Mon, 15 Apr 2024 12:28:24 +0000 https://preview-courier.web.cern.ch/?p=110469 The papers assembled in this volume range in subject matter from dark-matter searches and gravitational waves to artistic and philosophical considerations.

Advances in cosmology

On the 30th anniversary of the discovery of weak neutral currents, the architects of the Standard Model of strong and electroweak interactions met in the CERN main auditorium on 16 September 2003 to debate the future of high-energy physics. During the panel discussion, Steven Weinberg repeatedly propounded the idea that cosmology is part of the future of high-energy physics, since cosmology “is now a science” as opposed to a mere theoretical framework characterised by diverging schools of thought. Twenty years later, this viewpoint may serve as a summary of the collection of articles in Advances in Cosmology.

The papers assembled in this volume encompass the themes that are today associated with the broad domain of cosmology. After a swift theoretical section, the contributions range from dark-matter searches (both at the LHC and in space) to gravitational waves and optical astronomy. The last two sections even explore the boundaries between cosmology, philosophy and artistic intuition. Indeed, as former CERN Director-General Rolf Heuer correctly puts it in his thoughtful foreword, the birth of quantum mechanics was also a philosophical enterprise: neither Wolfgang Pauli nor Werner Heisenberg ever denied their Platonic inspiration, and reading Timaeus (the famous Plato dialogue dealing with the origin and purpose of the universe) was essential for physicists of that generation to develop their notion of symmetry (see, for instance, Heisenberg’s 1969 book Physics and Beyond).

In around 370 pages, the editors of Advances in Cosmology manage to squeeze in more than two millennia of developments ranging from Pythagoras to the LHC, and for this reason the various contributions clearly follow different registers. Interested readers will find not only specific technical accounts but also the wisdom of science communicators and even artists. This is why the complementary parts of the monograph share common goals, even if they are not part of the same logical line of thinking.

Advances in Cosmology appeals to those who cherish an inclusive and eclectic approach to cosmology and, more generally, to modern science. While in the mid 1930s Edwin Hubble qualified the frontier of astronomy as the “realm of the nebulae”, modern cosmology combines the microscopic phenomena of quantum mechanics with the macroscopic effects of general relativity. As this monograph concretely demonstrates, the boundaries between particle phenomenology and the sciences of the universe are progressively fading away. Will the next 20 years witness only major theoretical and experimental breakthroughs, or more radical changes of paradigm? From the diverse contributions collected in this book, we could say, a posteriori, that scientific revolutions are never isolated, as they need environmental selection rules that come from cultural, technological and even religious boundary conditions that cannot be artificially manufactured. This is why paradigm shifts are often difficult to predict and only recognised well after their appearance.

Extremely Brilliant Source illuminates Paganini’s favourite violin https://cerncourier.com/a/extremely-brilliant-source-illuminates-paganinis-favourite-violin/ Thu, 21 Mar 2024 10:07:06 +0000 https://preview-courier.web.cern.ch/?p=110236 Intense beams of synchrotron X-rays have revealed the inner workings of Niccolò Paganini’s favourite violin.

Intense beams of synchrotron X-rays produced at the European Synchrotron Radiation Facility (ESRF) in Grenoble have revealed the inner workings of Niccolò Paganini’s favourite violin. Renowned for its acoustic prowess, the 280-year-old “Il Cannone” ranks among the most important instruments in the history of Western music. To help understand and preserve the precious artefact, the Municipality of Genoa in Italy and the Premio Paganini teamed up with researchers at the ESRF’s new BM18 beam line to study the structural status of the wood and its bonding.

Using multi-resolution propagation phase-contrast X-ray microtomography, a non-destructive technique widely used at the ESRF for palaeontology, the team was able to reconstruct a 3D image of the violin at the level of its cellular structure. In addition to revealing Il Cannone’s conservation status and structure, the results hint at the interventions made by luthiers throughout the instrument’s life.

Inaugurated in 1994, the ESRF was the first “third generation” synchrotron, using periodic magnetic arrays called undulators to deliver the world’s brightest X-ray beams. It consists of an 844 m-circumference 6 GeV electron storage ring with almost 50 experimental stations serving around 5000 users per year across a wide range of disciplines. The study of Paganini’s violin was made possible by an EUR 330 million upgrade called the Extremely Brilliant Source, which came online in 2020. With an increased X-ray brightness and coherent flux 100 times higher than before, the facility allows complex materials to be imaged more quickly and in greater detail.

“We had to deal with some logistical and technical challenges, but the ESRF team did an incredible job to make this dream a reality,” says Paul Tafforeau, ESRF scientist in charge of BM18. “I hope that this experiment will be the first in a long series. In few months, we will be able to work on much larger instruments, up to the size of a double bass.”

A year of celebrations https://cerncourier.com/a/a-year-of-celebrations/ Wed, 28 Feb 2024 15:32:40 +0000 https://preview-courier.web.cern.ch/?p=109888 2024 CERN turns 70 and celebrates all year round with events commemorating CERN's different contributions to science and society.

2024 marks CERN’s 70th anniversary, with a packed programme of events that connect CERN’s heritage with its exciting future. This will culminate in a grand celebration for the CERN community on 17 September followed by a high-level official ceremony on 1 October. Kicking off proceedings in CERN Science Gateway on 30 January is a public event exploring CERN’s science. Two events on 7 March and 18 April will showcase how innovation and technologies in high-energy physics have found applications in daily life and medicine, while the transformative potential of global collaboration is the topic of a fourth public event in mid-May. Events in June and July will focus on open questions in the field and on the future facilities needed to address them, and public events will also be organised in CERN’s member states and beyond. The full programme can be found at: cern70.web.cern.ch/.

Bite-sized travels in particle physics https://cerncourier.com/a/bite-sized-travels-in-particle-physics/ Tue, 23 Jan 2024 10:33:13 +0000 https://preview-courier.web.cern.ch/?p=110025 A unique and clever structure makes this collection of short stories stand out.

Faszinierende Teilchenphysik is certainly not the first popular book about elementary particle physics, and it won’t be the last. But its unique and clever structure makes it stand out.

Think of it as a collection of short stories, organised in 12 chapters covering everything from the underlying theories and technologies to the limits of the Standard Model and ideas beyond it. The book begins with a gentle introduction to the world of particles and finishes by linking the infinitely small to the infinitely large. Each double-page spread within these chapters features a different topic in particle physics, its players, rules of play, tools, concepts and mysteries. Turn a page, and you find a new topic.

Among these 150 spreads, which are referred to as “articles” by the diverse team of authors, the reader can learn about neutrinos, lattice QCD, plasma acceleration, Feynman diagrams, multi-messenger astronomy and much more. Each one manages to convey both the fascination of the subject and the central ideas and open questions within the two allocated pages. This makes for a great way of reading: the article about antimatter, for example, cross-references the article about baryogenesis, so flip from page 18 to page 304 to dig deeper into the antimatter mystery. Not sure what a baryon is? Check the glossary, then maybe jump to the article about matter and antimatter, CP violation or symmetries. There is no need to read this book from cover to cover. On the contrary, browsing is so deeply embedded in its concept that it even features a flip-book illustration of a particle collision on the bottom right-hand side of each spread. With a bit more care for captivating illustrations and graphic design, it could pass as a Dorling Kindersley-style travel guide to particle physics.

Faszinierende Teilchenphysik

The authors, who are based at different universities and labs in Germany, have backgrounds covering theoretical and experimental particle physics, astroparticle physics, accelerator and nuclear physics, and science communication. They have obviously put as much thought into this publication as they put in hours, because they manage to write about each topic in a way that is easy to follow, even if it’s hard to digest. Puns, comparisons to everyday life and drawings to accompany the articles make for a full browsing experience, and the references within the text and at the bottom of each page show how everything is connected deep down.

When I received Faszinierende Teilchenphysik for review, one of the authors jokingly accompanied it with the words “this book is meant for retired engineers and for aunts looking for a present for their science-student-to-be nieces.” That may well be the case, but this book’s target audience is much wider. Physics fans and amateurs will enjoy sinking their teeth into a new world of interlinked topics; undergraduates will value it as a quick reference source that is less obscure and more fun to read than Wikipedia; and physics professionals will find it a useful refresher for topics beyond their expertise. The book even dedicates its final article to those questioning whether it is worth spending money and brain power on tiny particles, ending with a passionate case for the many benefits of fundamental research – not just spin-offs such as tumour therapy or artificial intelligence, but also the pushing of the boundaries of knowledge outward.

And if you’re afraid that your school German might let you down, don’t worry: the English edition is already in the works and due to come out in 2024.

New CERNs for a fractured world https://cerncourier.com/a/new-cerns-for-a-fractured-world/ Wed, 17 Jan 2024 10:01:19 +0000 https://preview-courier.web.cern.ch/?p=109990 The resurgence of nationalism along with pressing global challenges call for stronger scientific communities, argue Leonard Lynn and Hal Salzman.

Although a brief period of hubris and short-sightedness at the end of the Cold War led some in the West to proclaim “the end of history” and a path to a unified global community, underlying and historically ever-present geopolitical tensions have surfaced again, perhaps as strongly as in the past. At the same time, the past decades have witnessed increased education of talented scientists and technologists across the globe, including in low- and middle-income countries that were once outside the leading science communities. To address the science and technology challenges of our time, we need to find ways to steady the ship to best navigate this changing global scene.

Just as CERN was born out of the ashes of global destruction and disarray – a condition that called for collaboration out of necessity – we propose that the resurgence of nationalism along with pressing challenges such as climate change, disease and artificial intelligence call for stronger scientific communities. At the time of CERN’s founding 70 years ago, European physicists, especially in sub-atomic physics, faced marginalisation. Devastated European countries could not separately fund the “big science” facilities necessary to do cutting-edge research. Moreover, physicists were divided by national loyalties to countries that had been enemies during the war. In the period that followed, it seemed that subatomic research would be dominated by the US and the USSR. Worse, it seemed all too likely that the nationalistic agendas in those nations would push for advances in catastrophic new military technologies.

Leonard Lynn

The creation and operation of CERN in that environment was monumental. CERN brought together scientists from various countries, eventually extending beyond Europe. It greatly advanced basic knowledge in fundamental physics and spun off practical technologies such as the web and medical equipment. It has also served as a template (greatly underused in our view) for other international science and technology organisations such as SESAME in the Middle East. Today, the challenges for global cooperation in science and technology are different from those facing the founders of CERN. Mostly Western Europeans, with a few US supporters, they shared the discipline of subatomic physics and included Nobel Laureates and other highly respected people who were able to enlist the help of supportive diplomats in the various founding states.

Moment for change

The current geopolitical moment calls forth the need for more CERN-like organisations, just as occurred in that brief post-war moment. New global institutes and organisations to address global problems will have to span a broad range of countries and cultures. They will have to overcome techno-nationalistic opportunism and fears, and deal with potential capture by multinational enterprises (as happened with the response to COVID).

Since its founding, CERN has increasingly shown the ability to cross cultural and political boundaries – most nations of the world have sent scientists to participate in CERN projects, and non-European countries such as India, Pakistan and Turkey are associate members. Some mention the importance of facility cafeterias and other venues where scientists from different countries can meet and have unofficial discussions. CERN has striven to keep decision-making separate from national interests by having a convention that precludes its involvement in military technologies, and by having decisions about projects made primarily by scientists. It has strong policies regarding the sharing of intellectual property developed at its facilities.

Hal Salzman

CERN’s contributions to basic science and to various important technologies are undisputed. We suggest its potential contributions to the organisation of global science and technology cooperation also deserve greater attention. A systematic examination of CERN’s governance system and membership should be undertaken and compared with the experiences of others. Analysing how the CERN model fits social-science studies of design principles, it is clear that CERN’s success offers important additional principles for cases where the common-pool resources are science and technology, and where members come from diverse cultural backgrounds. CERN has addressed the issues of bringing together scientists from countries that may have competing techno-nationalistic agendas, providing shelter not only against governments but also against multinational enterprises. It has focused on non-military technologies and on sharing its intellectual property. It is time that this organisational experience was rolled out for the even greater common good.

Electroweak milestones at CERN https://cerncourier.com/a/electroweak-milestones-at-cern/ Thu, 23 Nov 2023 17:12:47 +0000 https://preview-courier.web.cern.ch/?p=109779 A memorable scientific symposium in the new CERN Science Gateway on 31 October brought the past, present and future of electroweak exploration into vivid focus.

Celebrating the 1973 discovery of weak neutral currents by the Gargamelle experiment and the 1983 discoveries of the W and Z bosons by the UA1 and UA2 experiments at the SppS, a highly memorable scientific symposium in the new CERN Science Gateway on 31 October brought the past, present and future of electroweak exploration into vivid focus. “Weak neutral currents were the foundation, the W and Z bosons the pillars, and the Higgs boson the crown of the 50 year-long journey that paved the electroweak way,” said former Gargamelle member Dieter Haidt (DESY) in his opening presentation.

History could have turned out differently, said Haidt, since both CERN and Brookhaven National Laboratory (BNL) were competing in the new era of high-energy neutrino physics: “The CERN beam was a flop initially, allowing BNL to snatch the muon-neutrino discovery in 1962, but a second attempt at CERN was better.” This led André Lagarrigue to dream of a giant bubble chamber, Gargamelle, financed and built by French institutes and operated by CERN with beams from the Proton Synchrotron (PS) from 1970 to 1976. Picking out the neutral-current signal from the neutron-cascade background was a major challenge, and a solution seemed hopeless until Haidt and his collaborators made a breakthrough regarding the meson component of the cascade.

By early July 1973, it was realised that Gargamelle had seen a new effect. Paul Musset presented the results in the CERN auditorium on 19 July, yet by that autumn Gargamelle was “treated with derision” due to conflicting results from a competitor experiment in the US. ‘The Gargamelle claim is the worst thing to happen to CERN,’ Director-General John Adams was said to have remarked. Jack Steinberger even wagered his cellar that it was wrong. Following further cross checks by bombarding the detector with protons, the Gargamelle result stood firm. At the end of Haidt’s presentation, collaboration members who were present in the audience were recognised with a warm round of applause.

From the PS to the SPS

The neutral-current discovery and the subsequent Gargamelle measurement of the weak mixing angle made it clear not only that the electroweak theory was right but that the W and Z were within reach of the technology of the day. Moving from the PS to the SPS, Jean-Pierre Revol (Yonsei University) took the audience to the UA1 and UA2 experiments ten years later. Again, history could have taken a different turn. While CERN was working towards an e+e− collider to find the W and Z, said Revol, Carlo Rubbia proposed the radically different concept of a hadron collider — first to Fermilab, which, luckily for CERN, declined. All the ingredients were presented by Rubbia, Peter McIntyre and David Cline in 1976; the UA1 detector was proposed in 1978 and a second detector, UA2, was proposed by CERN six months later. UA1 was huge by the standards of the day, said Revol. “I was advised not to join, as there were too many people! It was a truly innovative project: the largest wire chamber ever built, with 4π coverage. The central tracker, which allowed online event displays, made UA1 the crucial stepping stone from bubble chambers to modern electronic ones. The DAQ was also revolutionary. It was the beginning of computer clusters, with the same power as IBM mainframes.”

First SppS collisions took place on 10 July 1981, and by mid-January 1983 ten candidate W events had been spotted by the two experiments. The W discovery was officially announced at CERN on 25 January 1983. The search for the Z then started to ramp up, with the UA1 team monitoring the “express line” event display around the clock. On 30 April, Marie Noelle Minard called Revol to say she had seen the first Z. Rubbia announced the result at a seminar on 27 May, and UA2 confirmed the discovery on 7 June. “The SppS was a most unlikely project but was a game changer,” said Haidt. “It gave CERN tremendous recognition and paved the way for future collaborations, at LEP then LHC.”

Former UA2 member Pierre Darriulat (Vietnam National Space Centre) concurred: “It was not clear at all at that time if the collider would work, but the machine worked better than expected and the detectors better than we could dream of.” He also spoke powerfully about the competition between UA1 and UA2: “We were happy, but it was spoiled in a way because there was all this talk of who would be ‘first’ to discover. It was so childish, so ridiculous, so unscientific. Our competition with UA1 was intense, but friendly and somewhat fun. We were deeply conscious of our debt toward Carlo and Simon [van der Meer], so we shared their joy when they were awarded the Nobel prize two years later.” Darriulat emphasised the major role of the Intersecting Storage Rings and the input of theorists such as John Ellis and Mary K Gaillard, reserving particular praise for Rubbia. “Carlo did the hard work. We joined at the last moment. We regarded him as the King, even if we were not all in his court, and we enjoyed the rare times when we saw the King naked!”

The ten years between the discovery of neutral currents and the W and Z bosons are what took CERN “from competent mediocrity to world leader”, said Lyn Evans in his account of the SppS feat. Simon van der Meer deserved special recognition, not just for his 1972 paper on stochastic cooling, but also his earlier invention of the magnetic horn, which was pivotal in increasing the neutrino flux in Gargamelle. Evans explained the crucial roles of the Initial Cooling Experiment and the Antiproton Accumulator, and the many modifications needed to turn the SPS into a proton-antiproton collider. “All of this knowledge was put into the LHC, which worked from the beginning extremely well and continues to do so. One example was intrabeam scattering. Understanding this is what gives us the very long beam lifetimes at the LHC.”

Long journey

The electroweak adventure began long before CERN existed, pointed out Wolfgang Hollik, with 2023 also marking the 90th anniversary of Fermi’s four-fermion model. The incorporation of parity violation came in 1957 and the theory itself was constructed in the 1960s by Glashow, Salam, Weinberg and others. But it wasn’t until ’t Hooft and Veltman showed in the early 1970s that the theory is renormalisable that it became a fully-fledged quantum field theory. This opened the door to precision electroweak physics and the ability to search for new particles, in particular the top quark and Higgs boson, that were not directly accessible to experiments. Electroweak theory also drove a new approach in theoretical particle physics based around working groups and common codes, noted Hollik.

The afternoon session of the symposium took participants deep into the myriad of electroweak measurements at LEP and SLD (Guy Wilkinson, University of Oxford), Tevatron and HERA (Bo Jayatilaka, Fermilab), and finally the LHC (Maarten Boonekamp, Université Paris-Saclay and Elisabetta Manca, UCLA). The challenges of such measurements at a hadron collider, especially of the W-boson mass, were emphasised, as were their synergies with QCD measurements in improving the precision of parton distribution functions.

The electroweak journey is far from over, however, with the Higgs boson offering the newest exploration tool. Rounding off a day of excellent presentations and personal reflections, Rebeca Gonzalez Suarez (Uppsala University) imagined a symposium 40 years from now when the proposed collider FCC-ee at CERN has been operating for 16 years and physicists have reconstructed nearly 10¹³ W and Z bosons. Such a machine would take the precision of electroweak physics into the keV realm and translate to a factor of seven increase in energy scale. “All of this brings exciting challenges: accelerator R&D, machine-detector interface, detector design, software development, theory calculations,” she said. “If we want to make it happen, now is the time to join and contribute!”

The power of objects https://cerncourier.com/a/the-power-of-objects/ Fri, 03 Nov 2023 12:23:31 +0000 https://preview-courier.web.cern.ch/?p=109535 Science centres impress with all kinds of high-tech exhibits, but often it is a simple object or piece of an experiment that holds the most fascinating stories.

The bottle Peter Higgs drank from

One of the key challenges of communicating particle physics – particularly when preserving and presenting tangible artefacts – is the sheer scale of the endeavour. The infrastructure of particle physics has frequently been likened to cathedrals: great vaulted caverns built by the hands of many in search of truths about our universe. And even the major facilities are only one part in the international network of particle physics. Museums, which are also often likened to cathedrals, weren’t typically designed with gigantic and internationally distributed artefacts in mind. And that’s before we consider the objects of study: you can’t display a particle in a glass case. So how can we find tangible ways to represent abstract physical phenomena? What does it mean to represent the work of the thousands of people involved in today’s particle-physics projects? And is it possible to capture a fleeting moment of discovery for posterity?

Sometimes, those fleeting moments are best captured by ephemeral objects. Something that might seem mundane or throwaway can provide eloquent insights into the real life of physics. The announcement of the discovery of the Higgs boson at CERN on 4 July 2012 was recorded in several formats, notably the film footage of Peter Higgs wiping away a tear in CERN’s main auditorium as the ATLAS and CMS teams announced the discovery of the particle whose existence he, François Englert and Robert Brout had predicted decades before.

A material memorial of the Higgs-boson discovery is the champagne bottle emptied by Higgs and John Ellis the night before the announcement. In fact, the quiet and modest Higgs usually prefers London Pride beer; unfortunately, the can that he drank on his flight home from Geneva after the announcement was not saved for posterity. But the champagne bottle also speaks to a common practice at CERN: around the site, particularly in the CERN Control Centre, there are arrays of empty bottles, opened in celebration of events including the LHC start-up, first physics collisions, major publications and other milestones.

Familiar yet unexpected objects such as the champagne bottle in the context of a display about physics can pique visitors’ interest and encourage them to move on to more complex-looking displays that they might otherwise pass by. As such, Higgs’ champagne bottle featured in the Collider exhibition (see above picture) produced by the London Science Museum in 2013 and another bottle is part of CERN’s heritage collection – a curated assortment of more than 200 objects that encapsulate CERN’s history.

Magic moments

Connecting to newsworthy moments or well-known people is usually a successful draw for visitors. Capitalising on the global success of the movie Oppenheimer, this year the Bradbury Science Museum at Los Alamos developed an exhibition of Oppenheimer-related artefacts, including his own copy of the Bhagavad Gita. At CERN Science Gateway, Tim Berners-Lee’s NeXT computer – used to host the first website – creates an immediate talking point for visitors who can barely imagine life without the web, despite the object itself being literally a black box.

Tim Berners-Lee’s NeXT machine

That said, it is rare for a scientific or technological artefact to be a “show piece” that would attract visitors in its own right, in the same way that they would queue to see a famous artwork. The Antikythera mechanism at the National Archaeological Museum in Athens, Galileo’s telescopes at the Museo Galileo in Florence, or the Apollo 11 command module at the National Air and Space Museum in Washington are not representative of the types of material generally found in science heritage collections. Most science-related objects are not that easy for non-specialists to engage with; to the uninitiated eye the tools of particle physics mostly look like wiring and plumbing. Exhibition developers therefore usually adopt the “key pieces” approach advocated by Dutch curator Ad Maas: setting objects in the context of an overall narrative and a rich array of materials including photographs, documents, film, audio and personal testimony brings them to life and allows developers to layer information for different audience tastes and interest levels. Thanks to CERN’s archives and heritage collection, there is a wide range of material to draw from.

Using the key-pieces approach, a single small part can be revealing of a much larger whole: for example, by following the “life story” of a lead tungstate crystal used in the CMS electromagnetic calorimeter – which was also featured at the Collider exhibition – we gain insights into the decades-long design and planning process for the CMS detector, and an adventure in production and testing that takes us on a journey via Moscow, Shanghai and Rome (with a detour to the UBS bank vaults in Zurich). The physical nature of the object itself reflects its design and production history, while also illustrating the phenomena of particle decay and scintillation. At CERN, you’ll find displays of crystals like these around the site, in public and private spaces.

Much of the scientific heritage of the 20th and 21st centuries was originally preserved by practitioners with a sixth sense of “this might be useful someday” rather than by professional curators. Today, CERN has detailed archival and heritage collection policies that offer guidance as to what kinds of material might be worth keeping for posterity. Of course it’s impossible to keep everything; we can’t predict for certain what avenues future historians might be interested in exploring, or what kinds of objects will be used to popularise science. But by preserving storehouses of memories, we might be keeping some building blocks for the cathedrals of a future age.

Going where the crowd is https://cerncourier.com/a/going-where-the-crowd-is/ Fri, 03 Nov 2023 12:19:43 +0000 https://preview-courier.web.cern.ch/?p=109531 Based on the success of CERN’s first Science Pavilion at the WOMAD music festival in 2016, the project has grown to become a highly successful outreach effort known as the CERN Festival Programme.

Summer means holidays, beaches, long evenings outside and, for many, attending an outdoor festival. Music festivals in particular have expanded all over the world, and the competition to offer new experiences to curious festival goers has created opportunities to share CERN’s work and science with this untapped audience, many of whom never normally go to science events. Based on the success of CERN’s first Science Pavilion at Peter Gabriel’s world music festival WOMAD in 2016, the project has grown to become a highly successful outreach effort known as the CERN Festival Programme. The programme, generally run over three days, offers a variety of shows, presentations, talks and hands-on workshops tailored to each country and demographic. The Pavilions are a real collaboration: a partnership between CERN, collaborating institutes in each country and the festival itself, each sharing costs and person power.

In 2019, four Science Pavilions were held in four different music and culture festivals in four different countries: the Big Bang Stage at the Ostrava festival in the Czech Republic, produced in partnership with Charles University and the Czech Technical University; the Magical Science Pavilion at the Pohoda Festival in Slovakia – an incredible space produced with Comenius University; the World of Physics at WOMAD in the UK, going strong year after year thanks to an enduring collaboration with Roger Jones of Lancaster University; and the Science Pavilion at the Roskilde Festival in Denmark, a highly successful relationship with Jørgen Beck Hansen at the Niels Bohr Institute in Copenhagen. More than 20,000 people came to the four spaces in 2019!

Workshops give people a chance to interact in a direct way with science and technology, as well as with physicists working on different experiments at CERN. They often can’t believe that these people who work for CERN have actually come to the festival to talk to them. A variety of topics are covered ranging from what’s new in physics to technological and scientific advances in the news that touch on people’s everyday lives, such as artificial intelligence. For 2023 we introduced a successful “scientific speed dating” with the young audience at Roskilde. A talk from CERN’s director for accelerators and technology Mike Lamont on physics and medicine and an informal “Chat with the AI experts” in the sunshine also proved incredibly popular at WOMAD this year. Between 4000 and 6000 people come to each Science Pavilion in each festival every year. Requests for new collaborations in other countries are coming in, and as a result there are currently ongoing discussions for Pavilions at festivals in the Netherlands and Spain. Physicists love the idea and their students are always an important asset to the event, with the most forward-thinking institutes keen to be part of the programme.

The feedback from visitors is clear: people love finding science at a music festival. The fact that the science is taken to them, where they are at their most comfortable, relaxed and receptive to new things, is key to the programme’s success. Comments range from “It’s a welcome break to sit in a cool space and learn something interesting and talk about stuff other than drinking and partying” to “I never liked science at school, I found it so boring and complicated, but here you make it fun and I’ve come back every year, I love it!”.

Recently, the Festival Programme was approved to be part of the CERN and Society Foundation. This means that an individual wishing to support this fantastic form of outreach and communication, or a company that understands the benefit of the programme and would like to have their logo at the festival next to ours, can now do so. It’s a great opportunity to reach new audiences, and especially to engage in those countries whose people are actually funding CERN.

A frog among birds https://cerncourier.com/a/a-frog-among-birds/ Thu, 24 Aug 2023 08:35:49 +0000 https://preview-courier.web.cern.ch/?p=109108 The book “Well, Doc, You’re In” is a fascinating glimpse within Dyson's vast and diverse legacy.

“Well, Doc, You’re In”: Freeman Dyson’s Journey through the Universe is a biographical account of an epochal theoretical physicist with a mind that was, by any measure, delightful and diverse. It portrays Dyson, a self-described frog among birds, as a one-off synthesis of blitz-spirit Britishness with American space-age can-do. Of the elite cadre of theoretical physicists who ushered in the era of quantum field theory, which dominates theoretical physics to this day, who else would have devoted so much time and sincere scientific energy to the development of a gargantuan spacecraft, powered by nuclear bombs periodically dropped beneath it, that would take human civilisation beyond our solar system!

Written by colleagues, friends, family members and selected experts, each chapter is more of a self-contained monograph, a link in a chain, than it is a portion of the continuous thread that one would find for a more traditional single-author biography. What is lost as a result of this format, such as an occasional repetition of key life moments, is more than sufficiently compensated by richness of perspective and a certain ease of pick-up put-down that comes from the narrational independence of the various chapters. If it has been a while since the reader last had a moment to pick it up, not much will be lost when one delves back in.

The early years of 20th-century theoretical physicists and mathematicians of Dyson’s calibre are often interwoven with events surrounding the development of nuclear weapons or codebreaking. Dyson’s story as told in “Well, Doc, You’re In” stands apart in this respect, as he spent the war years working in Bomber Command for the Royal Air Force in England. His reflections on aspects of his own experience mirror, in some ways, the sentiments of future colleagues involved in the Manhattan project: “Through science and technology, evil is organised bureaucratically so that no individual is responsible for what happens.”

“Well, Doc, You’re In”: Freeman Dyson’s Journey through the Universe

The following years spent wrestling with quantum electrodynamics (QED) at Cornell make for lighter reading. The scattered remarks from eminent theorists such as Bethe and Oppenheimer on Dyson and his work, as well as from Dyson on his eminent colleagues, bring a sense of reality to the unfolding developments that would ultimately become a momentous leap forward in the understanding of quantum field theory.

“The preservation and fostering of diversity is the great goal that I would like to see embodied in our ethical principles and in our political actions,” said Dyson. Following his deep contributions to QED, Dyson embraced this spirit of diversity and jumped from scientific pond to pond in search of progress, be it the stability of matter or the properties of random matrices. It is interesting to learn, with hindsight, of the questions that gripped Dyson’s imagination at a time when particle physics was entering a golden era. As a reader one almost feels the contrarian spirit, or rebellion, in these choices as they are laid out against this backdrop.

Although scientifically Dyson may have been a frog, jumping from pond to pond, professionally he was anything but. Aged 29 he moved to the Institute for Advanced Study at Princeton and he stayed there to the end. In around 1960 Dyson joined the JASON defence advisory group, a group of scientists advising the US government on scientific matters. He remained a member until his passing in 2020. This consistent backdrop makes for a biographical story, which is essentially free from the distractions of the professional manoeuvring that typically punctuates biographies of great scientists. A positive consequence is that the various authors, and the reader, may focus that bit more keenly on the workings of Dyson’s mind.

For as long as graduate students learn quantum field theory, they will encounter Dyson. Sci-fi fans will recognise the Dyson Sphere (a structure surrounding a star to allow advanced civilisations to harvest more energy) featured in Star Trek, or note the name of the Orion III Spaceplane in 2001: A Space Odyssey. Dyson’s legacy is as vast and diverse as the world his mind explored and “Well, Doc, You’re In” is a fascinating glimpse within.

A carnival of ideas in Kolkata https://cerncourier.com/a/a-carnival-of-ideas-in-kolkata/ Wed, 05 Jul 2023 10:05:02 +0000 https://preview-courier.web.cern.ch/?p=108837 The MMAP 2020 conference covered a mixture of low- to high-energy physics on the one hand and the cosmology of the creation of the universe on the other.

A one-of-a-kind conference, MMAP (Macrocosmos, Microcosmos, Accelerator and Philosophy) 2020, was held in May last year in Kolkata, India, attracting 200 participants in person and remotely. In an unusual format for an international conference, it combined the voyage from the microcosmos of elementary particles to the macrocosmos of our universe, up to the horizon and beyond, with accelerator physics and philosophy, through the medium of poetry and songs inspired by the Indian poet Rabindranath Tagore and the creative giant Satyajit Ray.

The first presentation was by Roger Penrose, who talked about black holes, singularities and conformal cyclic cosmology. He discussed the cosmology of dark matter and dark energy, and inspired participants with the fascinating idea of one aeon going over to another aeon endlessly with no beginning or end of time and space.

Larry McLerran’s talk “Quarkyonic matter and neutron stars” provided an intuitive understanding of the origin of the equation of state of neutron stars at very high density, followed by Debadesh Bandyopadhyay’s talk on unlocking the mysteries of neutron stars. Jean-Paul Blaizot talked about the emergence of hydrodynamics in expanding quark–gluon plasma, whereas Edward Shuryak discussed the role of sphaleron explosions and baryogenesis in the cosmological electroweak phase transition. Subir Sarkar’s talk “Testing the cosmological principle” was provocative, as usual, and Sunil Mukhi and Aninda Sinha described the prospects for string theory. Sumit Som, Chandana Bhattacharya, Nabanita Naskar and Arup Bandyopadhyay discussed the low- and medium-energy physics possible using cyclotrons at Kolkata.

Moving to extreme nuclear matter, Barbara Jacak talked about experimental studies of transport in dense gluon matter. Jürgen Schukraft, Federico Antinori, Tapan Nayak, Bedangadas Mohanty and Subhasis Chattopadhyay spoke on signatures of the early-universe quark–gluon plasma and described the experimental programme of the ALICE experiment at the LHC, and Dinesh Srivastava focussed on the electromagnetic signatures of quark–gluon plasma.

A carnival of ideas, a mixture of low- to high-energy physics on the one hand and the cosmology of the creation of the universe on the other

Amanda Cooper-Sarkar emphasised the role of parton distribution functions in searches for new physics at colliders such as the LHC. Shoji Nagamiya presented the physics prospects of the J-PARC facility in Japan, Paolo Giubellino described the evolution of the FAIR accelerator complex at GSI, and Horst Stöcker discussed how to observe strangelets using fluctuation tools. In his presentation on the history of CERN, former Director-General Rolf Heuer talked about the marvels of large-scale collaboration, capturing the thrill of a big discovery.

The MMAP 2020 conference witnessed a carnival of ideas, a mixture of low- to high-energy physics on the one hand and the cosmology of the creation of the universe on the other.

The post A carnival of ideas in Kolkata appeared first on CERN Courier.

]]>
Meeting report The MMAP 2020 conference covered a mixture of low- to high-energy physics on the one hand and the cosmology of the creation of the universe on the other. https://cerncourier.com/wp-content/uploads/2023/07/CCJulAug23_FN_MMAP.jpg
A tribute to a great physicist https://cerncourier.com/a/a-tribute-to-a-great-physicist/ Wed, 05 Jul 2023 09:01:48 +0000 https://preview-courier.web.cern.ch/?p=108765 This interesting book also gives a good impression of how particle physics and physicists functioned over the past 70 years.

The post A tribute to a great physicist appeared first on CERN Courier.

]]>
Jack Steinberger

This book was written on the occasion of the 100th anniversary of the birth of Jack Steinberger. Edited by Jack’s former colleagues Weimin Wu and KK Phua with his daughter Julia Steinberger, it is a tribute to the important role that Jack played in particle physics at CERN and elsewhere, and also highlights many aspects of his life outside physics.

The book begins with a nice introduction by his daughter, herself a well-known scientist. She describes Jack’s family life, his hobbies, interests, passions and engagement, such as with the Pugwash conference series. The introduction is followed by a number of short essays by former friends and colleagues. The first is a transcript of an interview with Jack by Swapan Chattopadhyay in 2017. It contains recollections of Jack’s time at the University of Chicago with his PhD supervisor Enrico Fermi, and concludes with his connections with Germany later in life.

Drive and leadership

The next essays highlight the essential impact that Jack had in all the experiments he participated in, mostly as spokesperson, and underline his original ideas, drive and leadership, not just professionally but also in his personal life. Stories include those by Hallstein Høgåsen, a fellow in the CERN theory department, who describes Jack’s determination and perseverance in mountaineering. S Lokanathan worked with Jack as a graduate student in the early 1950s in Nevis Labs and remained in contact with him, including later on when he became a professor in Jaipur. Jacques Lefrançois covers the ALEPH period, and Vera Luth the earlier kaon experiments at CERN. Italo Mannelli recalls the early times when Jack visited Bologna to work with Marcello Conversi and Giampietro Puppi, before turning to his work at the NA31 experiment on direct CP violation in the K0 system.

Gigi Rolandi emphasises the important role that Jack played in the design and construction of the ALEPH time projection chamber. Another good essay is by David N Schwartz, the son of Mel Schwartz, who shared the Nobel prize with Jack and Leon Lederman. When David was born, Jack was Mel Schwartz’s thesis supervisor. As Jack was a friend of the Schwartz family, they were in regular contact all along. David describes how his father and Jack worked together and how, together with Leon Lederman, they started the famous muon neutrino experiment in 1959. As David Schwartz later became involved in arms control for the US in Geneva, he kept in contact with Jack, who had always been very passionate about arms control. David also remembers the great respect that Jack had for his thesis supervisor Enrico Fermi. The final essay is by Weimin Wu, one of the first Chinese physicists to join the international high-energy physics research community. Weimin started to work on ALEPH in 1979 and has remained a friend of the family since. He describes not only the important role that Jack played as a professor, mentor and role model, but also his part in establishing the link between ALEPH and the Chinese high-energy physics community.

Memorial Volume for Jack Steinberger

All these essays describe the enormous qualities of Jack as a physicist and as a leader. But they also highlight his social and human strengths. The reader gets a good feeling of Jack’s interests and hobbies outside of physics, such as music, climbing, skiing and sailing. Many of the essays are also accompanied by photographs, covering all parts of his life, and they are free from formulae or complicated physics explanations.

For those who want to go deeper into the physics that Jack was involved with, the second part of the book consists of a selection of his most important and representative publications, chosen and introduced by Dieter Schlatter. The first two papers from the 1950s deal with neutral meson production by photons and a possible detection of parity non-conservation in hyperon decays. They are followed by the Nobel prize-winning paper “Observation of High-Energy Neutrino Reactions and the Existence of Two Kinds of Neutrinos” from 1962, three papers on CP violation in kaon decays at CERN (including the first evidence for direct CP violation by NA31), then five important publications from the CDHS neutrino experiment (officially referred to as WA1) on inclusive neutrino and anti-neutrino interactions, charged-current structure functions, gluon distributions and more. Of course, the list would not be complete without a few papers from his last experiment, ALEPH, including the seminal one on the determination of the number of light neutrino species – a beautiful follow-up to Jack’s earlier discovery that there are at least two types of neutrinos.

This agreeable and interesting book will primarily appeal to those who have met or known Jack. But others, including younger physicists, will read the book with pleasure as it gives a good impression of how physics and physicists functioned over the past 70 years. It is therefore highly recommended.

The post A tribute to a great physicist appeared first on CERN Courier.

]]>
Review This interesting book also gives a good impression of how particle physics and physicists functioned over the past 70 years. https://cerncourier.com/wp-content/uploads/2023/07/CCJulAug23_REV_Steinberger_feature.jpg
CERN’s neutrino odyssey https://cerncourier.com/a/cerns-neutrino-odyssey/ Mon, 03 Jul 2023 13:37:13 +0000 https://preview-courier.web.cern.ch/?p=108736 The backstory and legacy of the Gargamelle collaboration’s epochal discovery of neutral currents 50 years ago.

The post CERN’s neutrino odyssey appeared first on CERN Courier.

]]>
The first candidate leptonic neutral-current event

The neutrino had barely been known for two years when CERN’s illustrious neutrino programme got under way. As early as 1958, the 600 MeV Synchrocyclotron enabled the first observation of the decay of a charged pion into an electron and a neutrino – a key piece in the puzzle of weak interactions. Dedicated neutrino-beam experiments began a couple of years later when the Proton Synchrotron (PS) entered operation, rivalled by activities at Brookhaven’s higher-energy Alternating Gradient Synchrotron in the US. Producing the neutrino beam was relatively straightforward: make a proton beam from the PS hit an internal target to produce pions and kaons, let them fly some distance during which they can produce neutrinos when they decay, then use iron shielding to filter out the remaining hadrons, such that only neutrinos and muons remain. Ensuring that a new generation of particle detectors would enable the study of neutrino-beam interactions proved a tougher challenge.
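For concreteness, the neutrinos in such a beam come predominantly from the two-body decays of the focused pions and kaons – standard textbook reactions rather than details spelled out in this article:

\[
\pi^+ \to \mu^+ + \nu_\mu , \qquad K^+ \to \mu^+ + \nu_\mu ,
\]

with the charge-conjugate decays yielding an antineutrino beam when negative secondaries are selected instead.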

CERN began with two small, 1 m-long heavy-liquid bubble chambers exposed to beams produced by protons striking an internal target inside the PS, hoping to see at least one neutrino event per day. The actual rate was nowhere near that. Unfortunately the target configuration had made the beams about 10 times less intense than expected, and in 1961 CERN’s nascent neutrino programme came to a halt. “It was a big disappointment,” recalls Don Cundy, who was a young scientist at CERN at the time. “Then, several months later, Brookhaven did the same experiment but this time they put the target in the right place, and they discovered that there were two neutrinos – the muon neutrino (νµ) and the electron neutrino (νe) – a great discovery for which Lederman, Schwartz and Steinberger received the Nobel prize some 25 years later.”

Despite this setback, CERN Director-General Victor Weisskopf, along with his Director of Research Gilberto Bernardini and the CERN team, decided to embark on an even more ambitious setup. Employing Simon van der Meer’s recently proposed “magnetic horn” – a high-current, pulsed focusing device placed around the target – and placing the target in an external beam pipe increased the neutrino flux by about two orders of magnitude. In 1963 this opened a new series of neutrino experiments at CERN. They began with a heavy-liquid bubble chamber containing around 500 kg of freon and a spark-chamber detector weighing several tonnes, for which first results were presented at a conference in Siena that year. The bubble-chamber results were particularly impressive, recalls Cundy: “Even though the number of events was of the order of a few hundred, you could do a lot of physics: measure the elastic form factor of the nucleon, single pion production, the total cross section, search for intermediate weak bosons and give limits on neutral-current processes.” It was at that conference that André Lagarrigue of Orsay urged that bubble chambers were the way forward for neutrino physics, and proposed to build the biggest chamber possible: Gargamelle, named after a giantess from a fictional renaissance story.

Magnetic horn

Construction in France of the much larger Gargamelle chamber, 4.8 m long and containing 18 tonnes of freon, was quick, and by the end of 1970 the detector was receiving a beam of muon neutrinos from the PS. The Gargamelle collaboration consisted of researchers from seven European institutes: Aachen, Brussels, CERN, École Polytechnique Paris, Milan, LAL Orsay and University College London. In 1969 the collaboration had made a list of physics priorities. Following the results of CERN’s Heavy Liquid Bubble Chamber, which set new limits on neutrino-electron scattering and single-pion neutral-current (NC) processes, the search for actual NC events made it onto the list. However, it only placed eighth out of 10 science goals. That is quite understandable, comments Cundy: “People thought that the most sensitive way to look for NCs was the decay of a K0 meson into two muons or two electrons but that had a very low branching ratio, so if NCs existed it would be at a very small level. The first thing on the list for Gargamelle was in fact looking at the structure of the nucleon, to measure the total cross section and to investigate the quark model.” 

Setting priorities

After the discovery of the neutrino in 1956 by Reines and Cowan (CERN Courier July/August 2016 p17), the weak interaction became a focus of nuclear research. The unification of the electromagnetic and weak interactions by Salam, Glashow and Weinberg a decade later motivated experiments to look for the electroweak carriers: the W boson, which mediates charged-current interactions, and the Z boson associated with neutral currents. While the existence of the former was inferred from β decay, the latter was barely thought of. Neutral currents started to become interesting in 1971, after Martinus Veltman and Gerard ’t Hooft proved the renormalisability of the electroweak theory.

More than 60 years after first putting the neutrino to work, CERN’s neutrino programme continues to evolve

By that time, Gargamelle was running at full speed. Analysing the photographs that were taken every time the PS was pulsed to look for interesting tracks were CERN personnel (at the time often referred to as “scanning girls”) who essentially performed the role of a modern level-1 trigger. Interactions were divided into different classes depending on the number of particles involved (muons, hadrons, electron–positron pairs, even one or more isolated protons as well as isolated electrons and positrons). The leptonic NC process (νµ + e⁻ → νµ + e⁻) would give an event that consisted of a single energetic electron. Since the background was very low, it would be the smoking gun for NCs. However, the cross-section was also very low, with only one to nine events expected from the electroweak calculations. The energetic hadronic NC event (νµ + N → νµ + X, with the respective process involving antiparticles if the reaction was triggered by an antineutrino beam) would consist only of several hadrons, in fact just like events produced by incoming high-energy neutrons.

Gargamelle scanning table

“When the first leptonic event was found in December 1972 we were convinced that NCs existed,” says Gargamelle member Donatella Cavalli from the University of Milan. “It was just one event but with very low background, so a lot of effort was put into the search for hadronic NC events and in the full understanding of the background. I was the youngest in my group and I remember spending the evenings with my colleagues scanning the films on special projectors, which allowed us to observe the eight views of the chamber. I proudly remember my travels to Paris, London and Brussels, taking the photographs of the candidate events found in Milan to be checked with colleagues from other groups.”

At a CERN seminar on 19 July 1973, Paul Musset, who was one of the principal investigators, presented Gargamelle’s evidence for NCs based on both the leptonic and hadronic analyses. Results from the former had been published in a short paper received by Physics Letters two weeks earlier, while the paper on the hadronic events, which reported on the actual observation and hence confirmation of neutral currents, was received on 23 July. In August 1973 Gerald Myatt of  University College London, now at the University of Oxford, presented the results at the Electron-Photon conference. The papers were published in the same issue of the journal on 3 September. Yet many physicists doubted them. “It was generally believed that Gargamelle made a mistake,” says Myatt. “There was only one event, a tiny track really, and very low background. Still, it was not seen as conclusive evidence.” Among the critical voices were T D Lee, who was utterly unimpressed, and Jack Steinberger, who went as far as to bet half his wine cellar that the Gargamelle result would be wrong. 

The difficulty was to demonstrate that the hadronic NC signal was not due to background from neutral hadrons. “A lot of work and many different checks were done, from calculations to a full Monte Carlo simulation to a comparison between spatial distributions of charged- and neutral-current events,” explains Cavalli. “We were really happy when we published the first results from hadronic and leptonic NCs after all background checks, because we were confident in our results.” Initially the Gargamelle results were confirmed by the independent HPWF (Harvard–Pennsylvania–Wisconsin–Fermilab) experiment at Fermilab. Unfortunately, a problem with the HPWF setup led to their paper being rewritten, and a new analysis presented in November 1973 showed no sign of NCs. It was not until the following year that the modified HPWF apparatus and other experiments confirmed Gargamelle’s findings. 

André Lagarrigue

Additionally, the collaboration managed to tick off number two on its list of physics priorities: deep-inelastic scattering and scaling. Confirming earlier results from SLAC which showed that the proton is made of point-like constituents, Gargamelle data were crucial in proving that these constituents (quarks) have charges of +2/3 and –1/3. For neutral currents, the icing on the cake came 10 years after Gargamelle’s discovery, with the direct observation of the Z (and W) bosons at the Spp̄S collider in 1983. The next milestone for CERN in understanding weak interactions came in 1990 with the precise measurement of the decay width of the Z boson at LEP, which showed that there are three and no more light neutrinos.

Legacy of a giantess

In 1977 Gargamelle was moved from the PS to the newly installed Super Proton Synchrotron (SPS). The following year, however, metal fatigue caused the chamber to crack and the experiment was decommissioned. Some of the collaboration members – including Cundy and Myatt – went to work on the nearby Big European Bubble Chamber. Also hooked up to the SPS for neutrino studies at that time were CDHS (CERN–Dortmund–Heidelberg–Saclay, officially denoted WA1) led by Steinberger, and Klaus Winter’s CHARM experiment. Operating for eight years, these large detectors collected millions of events that enabled precision studies on the structure of the charged and neutral currents as well as the structure of nucleons and the first evidence for QCD via scaling violations. 

The third type

The completion of the CHARM programme in 1991 marked the halt of neutrino operations at CERN for the first time in almost 30 years. But not for long. Experimental activities restarted with the search for neutrino oscillations, driven by the idea that neutrinos were an important component of dark matter in the universe. Consequently, two similarly styled short-baseline neutrino-beam experiments – CHORUS and NOMAD – were built. These next-generation detectors, which took data from 1994 to 1998 and from 1995 to 1998, respectively, joined others around the world to look for interactions of the third neutrino type, the ντ, and to search for neutrino oscillations, i.e. the change in neutrino flavour as they propagate, which was proposed in the 1950s and confirmed in 1998 by the Super-Kamiokande experiment in Japan and subsequently by SNO in Canada. In 2000 the DONUT experiment at Fermilab reported the first direct evidence for ντ interactions.

Gargamelle bubble chamber

CERN’s neutrino programme entered a hiatus until July 2006, when the SPS began firing an intense beam of muon neutrinos 732 km through Earth to two huge detectors – ICARUS and OPERA – located underground at Gran Sasso National Laboratory in Italy. Designed to make precision measurements of neutrino oscillations, the CERN Neutrinos to Gran Sasso (CNGS) programme observed the oscillation of muon neutrinos into tau neutrinos and was completed in 2012. 

As the CERN neutrino-beam programme was wound down, a brand-new initiative to support fundamental neutrino research began. “The initial idea for a ‘neutrino platform’ at CERN was to do a short-baseline neutrino experiment involving ICARUS to check the LSND anomaly, and another to test prototypes for ‘LBNO’, which would have been a European long-baseline neutrino oscillation experiment sending beams from CERN to Pyhäsalmi in Finland to investigate the oscillation,” says Dario Autiero, who has been involved in CERN’s neutrino programme since the beginning of the 1980s. “The former was later decided to take place at Fermilab, while for the latter the European and US visions for long-baseline experiments found a consensus for what is now DUNE (the Deep Underground Neutrino Experiment) in the US.”

A unique facility

Officially launched in 2013 as part of the update to the European strategy for particle physics, the CERN Neutrino Platform serves as a unique R&D facility for next-generation long-baseline neutrino experiments. Its most prominent project is the design, construction and testing of prototype detectors for DUNE, which will see a neutrino beam from Fermilab sent 1300 km to the SURF laboratory in South Dakota. One of the Neutrino Platform’s early successes was the refurbishment of the ICARUS detector, which is now taking data at Fermilab’s short-baseline neutrino programme. The platform is also developing key technologies for the near detector for the Tokai-to-Kamioka (T2K) neutrino facility in Japan (see p10), and has a dedicated theory working group aimed at strengthening the connections between CERN and the worldwide neutrino community. Independently, the NA61 experiment at the SPS is contributing to a better understanding of neutrino–nucleon cross sections for DUNE and T2K data.

Neutrino Platform at CERN’s North Area

More than 60 years after first putting the neutrino to work, CERN’s neutrino programme continues to evolve. In April 2023 a new experiment at the LHC called FASER made the first observation of neutrinos produced at a collider. Together with another new experiment, SND@LHC, FASER will enable the study of neutrinos in a new energy range and compare the production rate of all three types of neutrinos to further test the Standard Model. 

As for Gargamelle, today it lies next to BEBC and other retired colleagues in the garden of Square van Hove behind CERN’s main entrance. Not many can still retell the story of the discovery of neutral currents, but those who can share it with delight. “It was very tiny, that first track from the electron, one in hundreds of thousands of pictures,” says Myatt. “Yet it justified André Lagarrigue’s vision of the large heavy-liquid bubble chamber as an ideal detector of neutrinos, combining large mass with a very finely detailed picture of the interaction. There can be no doubt that it was these features that enabled Gargamelle to make one of the most significant discoveries in the history of CERN.”

The post CERN’s neutrino odyssey appeared first on CERN Courier.

]]>
Feature The backstory and legacy of the Gargamelle collaboration’s epochal discovery of neutral currents 50 years ago. https://cerncourier.com/wp-content/uploads/2023/07/CCJulAug23_NEUTRAL_feature.jpg
The Cabibbo angle, 60 years later https://cerncourier.com/a/the-cabibbo-angle-60-years-later/ Mon, 24 Apr 2023 14:01:27 +0000 https://preview-courier.web.cern.ch/?p=108253 Nicola Cabibbo's short 1963 paper paved the way to the modern unification of electromagnetic and weak interactions. 

The post The Cabibbo angle, 60 years later appeared first on CERN Courier.

]]>
Nicola Cabibbo

In a 1961 book, Richard Feynman describes the great satisfaction he and Murray Gell-Mann felt in formulating a theory that explained the close equality of the Fermi constants for muon and neutron-beta decay. These two physicists and, independently, Gershtein and Zeldovich, had discovered the universality of weak interactions. It was a generalisation of the universality of electric charge and strongly suggested the existence of a common origin of the two interactions, an insight that was the basis for unified theories developed later. 

Fermi’s description of neutron beta decay (n → p + e⁻ + ν̄e) involved the product of two vector currents analogous to the electromagnetic current: a nuclear current transforming the neutron into a proton and a leptonic current creating the electron–antineutrino pair. Subsequent studies of nuclear decays and the discovery of parity violation complicated the description, introducing all possible kinds of relativistically invariant interactions that could be responsible for neutron beta decay.

The decay of the muon (μ⁻ → νμ + e⁻ + ν̄e) was also found to involve the product of two vector currents, one transforming the muon into its own neutrino and the other creating the electron–antineutrino pair. What Feynman and Gell-Mann, and Gershtein and Zeldovich, had found is that the nuclear and lepton vector currents have the same strength, despite the fact that the n → p transition is affected by the strong nuclear interaction while the μ → νμ and e → νe transitions are not (we are anticipating here what was discovered only later, namely that the electron and muon each have their own neutrino).

At the end of the 1950s, simplicity finally emerged. As proposed by Sudarshan and Marshak, and by Feynman and Gell-Mann, all known beta decays are described by the products of two currents, each a combination of a vector and an axial vector current. Feynman notes: after 23 years, we are back to Fermi! 

The book of 1961, however, also records Feynman’s dismay after the discovery that the Fermi constants of strange-particle beta decays, for example the lambda–hyperon beta decay Λ → p + e⁻ + ν̄e, were smaller by a factor of four or five than the theoretical prediction. In 1960 Gell-Mann, together with Maurice Lévy, had tried to solve the problem but, while taking a step in the right direction, they concluded that it was not possible to make quantitative predictions for the observed decays of the hyperons. It was up to Nicola Cabibbo, in a short article published in 1963 in Physical Review Letters, to reconcile strange-particle decays with the universality of weak interactions, paving the way to the modern unification of electromagnetic and weak interactions.

Over to Frascati 

Nicola had graduated in Rome in 1958, under his tutor Bruno Touschek. Hired by Giorgio Salvini, he was the first theoretical physicist in the Electro-Synchrotron Frascati laboratories. There, Nicola met Raoul Gatto, five years his elder, who was coming back from Berkeley, and they began an extremely fruitful collaboration. 

These were exciting times in Frascati: the first e⁺e⁻ collider, AdA (Anello di Accumulazione), was being realised, to be followed later by a larger machine, Adone, reaching up to 3 GeV in the centre of mass. New particles were studied at the electro-synchrotron, related to the newly discovered SU(3) flavour symmetry (e.g. the η meson). Cabibbo and Gatto authored an important article on e⁺e⁻ physics and, in 1961, they investigated the weak interactions of hadrons in the framework of the SU(3) symmetry. Gatto and Cabibbo and, at the same time, Coleman and Glashow, observed that vector currents associated with the SU(3) symmetry by Noether’s theorem include a strangeness-changing current, V(ΔS = 1), that could be associated with strangeness-changing beta decays in addition to the isospin current, V(ΔS = 0), responsible for strangeness-non-changing beta decays – the same considered by Feynman and Gell-Mann. For strange-particle decays, the identification implied that the variation of strangeness in the hadronic system has to be equal to the variation of the electric charge (in short: ΔS = ΔQ). The rule is satisfied in Λ beta decay (Λ: S = –1, Q = 0; p: S = 0, Q = +1). However, it conflicted with a single event allegedly observed at Berkeley in 1962 and interpreted as Σ⁺ → n + μ⁺ + νμ, which had ΔS = –ΔQ (Σ⁺: S = –1, Q = +1; n: S = Q = 0). In addition, the problem remained of how to correctly formulate the concept of muon–hadron universality in the presence of four vector currents describing the transitions e → νe, μ → νμ, n → p and Λ → p.

Cabibbo’s angle

In his 1963 paper, written while he was working at CERN, Nicola made a few decisive steps. First, he decided to ignore the evidence of a ΔS = –ΔQ component suggested by Berkeley’s Σ⁺ → n + μ⁺ + νμ event. Nicola was a good friend of Paolo Franzini, then at Columbia University, and the fact that Paolo, with larger statistics, had not yet seen any such event provided a crucial hint. Next, to describe both ΔS = 0 and ΔS = 1 weak decays, Nicola formulated a notion of universality between each leptonic vector current (electronic or muonic) and one, and only one, hadronic vector current. He assumed the current to be a combination of the two currents determined by the SU(3) symmetry that he had studied with Gatto in Frascati (also identified by Coleman and Glashow): Vhadron = aV(ΔS = 0) + bV(ΔS = 1), with a and b being numerical constants. By construction, V(ΔS = 0) and V(ΔS = 1) have the same strength as the electron or muon currents; for the hadronic current to have the same strength, one requires a² + b² = 1, that is a = cosθ, b = sinθ.
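In modern notation – a compact restatement of the relations just described, not a quotation of Cabibbo’s original formulae – the hadronic current and its normalisation read

\[
V_{\mathrm{hadron}} = \cos\theta \, V^{(\Delta S = 0)} + \sin\theta \, V^{(\Delta S = 1)},
\qquad
a^2 + b^2 = \cos^2\theta + \sin^2\theta = 1 .
\]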

Cabibbo’s 1963 paper

Cabibbo obtained the final expression of the hadronic weak current, adding to these hypotheses the V–A formulation of the weak interactions. The angle θ became a new constant of nature, known since then as the Cabibbo angle. 

An important point is that the Cabibbo theory is based on the currents associated with SU(3) symmetry. For one, this means that it can be applied to the beta decays of all hadrons, mesons and baryons belonging to the different SU(3) multiplets. This was not the case for the precursory Gell-Mann–Lévy theory, which also assumed one hadron weak current but was formulated in terms of protons and lambdas, and could not be applied to the other hyperons or to the mesons. In addition, in the limit of exact SU(3) symmetry one can prove a non-renormalisation theorem for the ΔS = 1 vector current, which is entirely analogous to the one proved by Feynman and Gell-Mann for the ΔS = 0 isospin current. The Cabibbo combination, then, guarantees the universality of the full hadron weak current to the lepton current for any value of the Cabibbo angle, the suppression of the beta decays of strange particles being naturally explained by a small value of θ. Remarkably, a theorem derived by Ademollo and Gatto, and by Fubini a few years later, states that the non-renormalisation of the vector current’s strength is also valid to the first order in SU(3) symmetry breaking. 

Photons and quarks

In many instances, Nicola mentioned that a source of inspiration for his assumption for the hadron current was the passage of photons through a polarimeter, a subject he had considered in Frascati in connection with possible experiments of electron scattering through polarised crystals. Linearly polarised photons can be described via two orthogonal states, but what is transmitted is only the linear combination corresponding to the direction determined by the polarimeter. Similarly, there are two orthogonal hadron currents, V (ΔS = 0) and V (ΔS = 1), but only the Cabibbo combination couples to the weak interactions. 

An interpretation closer to particle physics came with the discovery of quarks. In quark language, V(ΔS = 0) induces the transition d → u and V(ΔS = 1) the transition s → u. The Cabibbo combination then corresponds to dC = (cos θ d + sin θ s) → u. Stated differently, the u quark is coupled by the weak interaction only to one, specific, superposition of d and s quarks: the Cabibbo combination dC. This is Cabibbo mixing, reflecting the fact that in SU(3) there are two quarks with the same charge –1/3.
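Written out explicitly (with θC denoting the Cabibbo angle), the two orthogonal down-type combinations are

\[
d_C = \cos\theta_C \, d + \sin\theta_C \, s ,
\qquad
s_C = -\sin\theta_C \, d + \cos\theta_C \, s ,
\]

where sC is the combination orthogonal to dC – the one to which, as recalled below in connection with the GIM mechanism, the charm quark couples.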

Testing quark mixing

A first comparison between theory and meson and hyperon beta-decay data was done by Cabibbo in his original paper, in the exact SU(3) limit. Specifically, the value of θ was obtained by comparing K+ and π+ semileptonic decays. In baryon semileptonic decays, the matrix elements of vector currents are determined by the SU(3) symmetry, while axial currents depend upon two parameters, the so-called F and D couplings. Many fits have been performed in successive years, which saw a dramatic increase in the decay modes observed, in statistics, and in precision. 

Four decades after the 1963 paper, Cabibbo, with Earl Swallow and Roland Winston, performed a complete analysis of hyperon decays in the Cabibbo theory, then embedded in the three-generation Kobayashi and Maskawa theory, taking into account the momentum dependence of vector currents. In their words (and in modern notation):
“… we obtain Vus = 0.2250(27) (= sin θ). This value is of similar precision, but higher than the one derived from Kl3, and in better agreement with the unitarity requirement |Vud|² + |Vus|² + |Vub|² = 1. We find that the Cabibbo model gives an excellent fit of the existing form factor data on baryon beta decays (χ² = 2.96 for three degrees of freedom) with F + D = 1.2670 ± 0.0030, F – D = –0.341 ± 0.016, and no indication of flavour SU(3) breaking effects.”
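As a rough numerical illustration of that unitarity requirement (taking |Vub| ≈ 0.004, an approximate present-day value that is not quoted above and is negligible at this precision):

\[
|V_{ud}| \simeq \sqrt{1 - |V_{us}|^2 - |V_{ub}|^2}
= \sqrt{1 - (0.2250)^2 - (0.004)^2} \approx 0.974 .
\]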

The Cabibbo theory predicts a reduction of the nuclear Fermi constant with respect to the muonic one by a factor cos θ ≈ 0.97. The discrepancy had been noticed by Feynman and S Berman, one of Feynman’s students, who estimated the possible effect of electromagnetic radiative corrections. The situation is much clearer today, with precise data coming from super-allowed Fermi nuclear transitions and radiative corrections under control.

Closing up

From its very publication, the Cabibbo theory was seen as a crucial development. It indicated the correct way to embody lepton-hadron universality and it enjoyed a heartening phenomenological success, which in turn indicated that we could be on the right track for a fundamental theory of weak interactions. 

The idea of quark mixing had profound consequences. It prompted the solution of the spectacular suppression of strangeness-changing neutral processes by the GIM mechanism (Glashow, Iliopoulos and Maiani), where the charm quark couples to the combination of down and strange quarks orthogonal to the Cabibbo combination. Building on Cabibbo mixing and GIM, it has been possible to extend to hadrons the unified SU(2)L × U(1) theory formulated, for leptons, by Glashow, and by Weinberg and Salam.

There are very few articles in the scientific literature in which one does not feel the need to change a single word and Cabibbo’s is definitely one of them

CP symmetry violations observed experimentally had no place in the two-generation scheme (four quarks, four leptons) but found an elegant description by Makoto Kobayashi and Toshihide Maskawa in the extension to three generations. Quark mixing introduced by Cabibbo is now described by a three-by-three unitary matrix known in the literature as the Cabibbo–Kobayashi–Maskawa (CKM) matrix. In the past 50 years the CKM scheme has been confirmed with ever increasing accuracy by a plethora of measurements and impressive theoretical predictions (see “Testing quark mixing” figure). Major achievements have been obtained in the studies of charm- and beauty-particle decays and mixing. The CKM paradigm remains a great success in predicting weak processes and in our understanding of the sources of CP violation in our universe. 
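In the now-standard notation – a textbook form rather than anything specific to the measurements discussed here – the quark mixings are collected in the unitary matrix

\[
V_{\mathrm{CKM}} =
\begin{pmatrix}
V_{ud} & V_{us} & V_{ub} \\
V_{cd} & V_{cs} & V_{cb} \\
V_{td} & V_{ts} & V_{tb}
\end{pmatrix},
\]

whose upper-left 2 × 2 block is, to a good approximation, the original Cabibbo rotation: Vud ≈ Vcs ≈ cos θC and Vus ≈ –Vcd ≈ sin θC.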

Nicola Cabibbo passed away in 2010. The authoritative book by Abraham Pais, in its chronology, cites the Cabibbo theory among the most important developments in post-war particle physics. In the History of CERN, Jean Iliopoulos writes: “There are very few articles in the scientific literature in which one does not feel the need to change a single word and Cabibbo’s is definitely one of them. With this work, he established himself as one of the leading theorists in the domain of weak interactions.”

The post The Cabibbo angle, 60 years later appeared first on CERN Courier.

]]>
Feature Nicola Cabibbo's short 1963 paper paved the way to the modern unification of electromagnetic and weak interactions.  https://cerncourier.com/wp-content/uploads/2023/04/CCMayJun23_CABIBBO_lecture.jpg
New physics in b decays https://cerncourier.com/a/new-physics-in-b-decays/ Mon, 24 Apr 2023 12:52:46 +0000 https://preview-courier.web.cern.ch/?p=108319 The book "New Physics in b decays" offers a pedagogical approach for early-career researchers.

The post New physics in b decays appeared first on CERN Courier.

]]>
There are compelling reasons to believe that the Standard Model (SM) of particle physics, while being the most successful theory of the fundamental structure of the universe, does not offer the complete picture of reality. However, until now, no new physics beyond the SM has been firmly established through direct searches at different energy scales. This motivates indirect searches, performed by precision examination of phenomena sensitive to contributions from possible new particles, and comparing their properties with the SM expectations. This is conceptually similar to how, decades ago, our understanding of radioactive beta decay allowed the existence and properties of the W boson to be predicted.

New Physics in b decays, by Marina Artuso, Gino Isidori and the late Sheldon Stone, is dedicated to precision measurements in decays of hadrons containing a b quark. Due to their high mass, these hadrons can decay into dozens of different final states, providing numerous ways to challenge our understanding of particle physics. As is usual for indirect searches, the crucial task is to understand and control all SM contributions to these decays. For b-hadron decays, the challenge is to control the effects of the strong interaction, which is difficult to calculate.

Both sides of the coin

The authors committed to a challenging task: providing a snapshot of a field that has developed considerably during the past decade. They highlight key measurements that generated interest in the community, often due to hints of deviations from the SM expectations. Some of the reported anomalies have diminished since the book was published, after larger datasets were analysed. Others continue to intrigue researchers. This natural scientific progress leads to a better understanding of both the theoretical and experimental sides of the coin. The authors exercise reasonable caution over the significance of the anomalies they present, warning the reader of the look-elsewhere effect, and carefully define the relevant observables. When discussing specific decay modes, they explain their choice compared to other processes. This pedagogical approach makes the book very useful for early-career researchers diving into the topic. 

The book starts with a theoretical introduction to heavy-quark physics within the SM, plotting avenues for searches for possible new-physics effects. Key theoretical concepts are introduced, along with the experiments that contributed most significantly to the field. The authors continue with an overview of “traditional” new-physics searches, strongly interleaving them with precision measurements of the free parameters of the SM, such as the couplings between quarks and the W boson. By determining these parameters precisely with several alternative experimental approaches, one hopes to observe discrepancies. An in-depth review of the experimental measurements, also featuring their complications, is confronted with theoretical interpretations. While some of the discrepancies stand out, it is difficult to attribute them to new physics as long as alternative interpretations are not excluded.

New Physics in b Decays

The second half of the book dives into recent anomalies in decays with leptons, and the theoretical models attempting to address them. The authors reflect on theoretical and experimental work of the past decade and outline a number of pathways to follow. The book continues with a short overview of searches for processes that are forbidden or extremely suppressed in the SM, such as lepton-flavour violation. These transitions, if observed, would represent an undeniable signature of new physics, although they only arise in a subset of new-physics scenarios. Such searches therefore allow strong limits to be placed on specific hypotheses. The book concludes with the authors’ view of the near future, which is already becoming reality. They expect the ongoing LHCb and Belle II experiments to have a decisive word on the current flavour anomalies, but also to deliver new, unexpected surprises. They rightly conclude that “It is difficult to make predictions, especially about the future.”

The remarkable feature of this book is that it is written by physicists who actively contributed to the development of numerous theoretical concepts and key experimental measurements in heavy-quark physics over the past decades. Unfortunately, one of the authors, Sheldon Stone, could not see his last book published. Sheldon was the editor of the book B decays, which served as the handbook on heavy-quark physics for decades. One can contemplate the impressive progress in the field by comparing the first edition of B decays in 1992 with New Physics in b decays. In the 1990s, heavy-quark decays were only starting to be probed. Now, they offer a well-oiled tool that can be used for precision tests of the SM and searches for minuscule effects of possible new physics, using decays that happen as rarely as once per billion b-hadrons.

The key message of this book is that theory and experiment must go hand in hand. Some parameters are difficult to calculate precisely and they need to be measured. The observables that are theoretically clean are often challenging experimentally. Therefore, the searches for new physics in b decays focus on processes that are accessible both from the theoretical and experimental points of view. The reach of such searches is constantly being broadened by painstakingly refining calculations and developing clever experimental techniques, with progress achieved through the routine work of hundreds of researchers in several experiments worldwide.

The post New physics in b decays appeared first on CERN Courier.

]]>
Review The book "New Physics in b decays" offers a pedagogical approach for early-career researchers. https://cerncourier.com/wp-content/uploads/2023/04/CCMayJun23_REV_Event.jpg
Cosmic rays for cultural heritage https://cerncourier.com/a/cosmic-rays-for-cultural-heritage/ Mon, 24 Apr 2023 12:52:11 +0000 https://preview-courier.web.cern.ch/?p=108244 Taking advantage of detectors used for particle physics, cosmogenic muons are becoming powerful tools for non-destructive imaging of large structures such as pyramids.

The post Cosmic rays for cultural heritage appeared first on CERN Courier.

]]>
In 1965, three years before being awarded a Nobel prize for his decisive contributions to elementary particle physics, Luis Alvarez proposed to use cosmic muons to look inside an Egyptian pyramid. A visit to the Giza pyramid complex a few years earlier had made him ponder why, despite the comparable size of the Great Pyramid of Khufu and the Pyramid of Khafre, the latter was built with a simpler structure – simpler even than the tomb of Khufu’s father Sneferu, under whose reign there had been architectural experimentation and pyramids had grown in complexity. Only one burial chamber is known in the superstructure of Khafre’s pyramid, while two are located in the tombs of each of his two predecessors. Alvarez’s doubts were not shared by many archaeologists, and he was certainly aware that the history of architecture is not a continuous process and that family relationships can be complicated; but like many adventurers before him, he was fascinated by the idea that some hidden chambers could still be waiting to be discovered.

The principles of muon radiography or “muography” were already textbook knowledge at that time. Muons are copiously produced in particle cascades originating from naturally occurring interactions between primary cosmic rays and atmospheric nuclei. The energy of these cosmogenic muons is typically large enough that, despite their relatively short intrinsic lifetime, relativistic time dilation allows most of them to survive the journey from the upper atmosphere to Earth’s surface – where their penetration power makes them a promising tool to probe the depths of very large and dense volumes non-destructively. Thick and dense objects can attenuate the cosmic-muon flux significantly by stopping its low-energy component, thus providing a “shadow” analogous to a conventional radiograph. The earliest known attempt to use muon flux attenuation for practical purposes was the estimation of the overburden of a tunnel in Australia using Geiger counters on a rail, published in 1955 in an engineering journal. The obscure precedent was probably unknown to Alvarez, who didn’t cite it.
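To make the attenuation principle concrete, here is a minimal toy calculation in Python – not the model used by any of the experiments described in this article. It assumes a single power-law integral spectrum for cosmic muons and a constant minimum-ionising energy loss of about 2 MeV per g/cm²; both numbers, as well as the rock density, are rough illustrative assumptions. It estimates the fraction of the open-sky muon flux that survives a given thickness of rock.

# Toy estimate of cosmic-muon transmission through rock, illustrating the
# flux-attenuation principle behind muography. All numerical inputs are
# rough, illustrative assumptions.

RHO_ROCK = 2.65       # g/cm^3, assumed density of "standard rock"
DEDX = 2.0e-3         # GeV per g/cm^2, approximate minimum-ionising energy loss
GAMMA = 1.7           # integral spectrum ~ E^-GAMMA above a few GeV (approximate)
E_REF = 1.0           # GeV, reference threshold defining the open-sky flux

def min_energy(thickness_m, density=RHO_ROCK):
    """Minimum muon energy (GeV) needed to cross the given thickness,
    ignoring radiative losses (a fair approximation well below ~1 TeV)."""
    opacity = density * thickness_m * 100.0   # column density in g/cm^2
    return DEDX * opacity

def transmission(thickness_m, density=RHO_ROCK):
    """Fraction of the open-sky muon flux above E_REF that survives."""
    e_min = max(min_energy(thickness_m, density), E_REF)
    return (e_min / E_REF) ** (-GAMMA)

for metres in (10, 50, 100, 200):
    print(f"{metres:4d} m of rock: E_min ~ {min_energy(metres):6.1f} GeV, "
          f"transmission ~ {transmission(metres):.1e}")

In this crude picture a 100 m overburden already cuts the flux by roughly three orders of magnitude, which is why muography exposures of pyramid- or volcano-sized targets typically last months.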

Led by Alvarez, the Joint Pyramid Project was officially established in 1966. The detector that the team built and installed in the known large chamber at the bottom of Khafre’s pyramid was based on spark chambers, which were standard equipment for particle-physics experiments at that time. Less common were the computers provided by IBM for Monte Carlo simulations, which played a crucial role in the data interpretation. It took some time for the project to take off. Just as the experiment was ready to take data, the Six-Day War broke out, delaying progress by several months until diplomatic relationships were restored between Cairo and Washington. All this might sound like a promising subject for a Hollywood blockbuster were it not for its anticlimax: no hidden chamber was found. Alvarez always insisted that there is a difference between not finding what you search for and conclusively excluding its existence, but despite this important distinction, one wonders how much muography’s fame would have benefitted from a discovery. Their study, published in Science in 1970, set an example that was followed in subsequent decades by many more interdisciplinary applications.  

The second pyramid to be muographed was in Mexico more than 30 years later, when researchers from the National Autonomous University of Mexico (UNAM) started to search for hidden chambers in the Pyramid of the Sun at Teotihuacan. Built by the Teotihuacan civilisation about 1800 years ago, long before the Aztecs gave the site its name, it is the third largest pyramid in the world after Khufu’s and Khafre’s, and its purpose is still a mystery. Although there is no sign that it contains burial chambers, the hypothesis that this monument served as a tomb is not entirely ruled out. After more than a decade of data taking, the UNAM muon detector (composed of six layers of multi-wire chambers occupying a total volume of 1.5 m³) found no hidden chamber. But the researchers did find evidence, reported in 2013, for a very wide low-density volume in the southern side, which is still not understood and led to speculation that this side of the pyramid might be in danger of collapse.

Big void 

Muography returned to Egypt with the ScanPyramids project, which has been taking data since 2015. The project made the headlines in 2017 by revealing an unexpected low-density anomaly in Khufu’s Great Pyramid, tantalisingly similar in size and shape to the Grand Gallery of the same building. Three teams of physicists from Japan and France participated in the endeavour, cross-checking each other by using different detector technologies: nuclear emulsions, plastic scintillators and micromegas. The latter, being gaseous detectors, had to be located externally to the pyramid to comply with safety regulations. Publishing in Nature Physics, all three teams reported a statistically significant excess in muon flux originating from the same 3D position (see “Khufu’s pyramid” figure). 

Khufu’s pyramid

This year, based on a larger data sample, the ScanPyramids team concluded that this “Big Void” is a horizontal corridor about 9 m long with a transverse section of around 2 × 2 m². Confidence in the solidity of these conclusions was provided by a cross-check measurement with ground-penetrating radar and ultrasound, carried out by Egyptian and German experts, which had been taking data since 2020 and was published simultaneously. The consistency of the data from muography and conventional methods motivated visual inspection via an endoscope, confirming the claim. While the purpose of this unexpected feature of the pyramid is not yet known, the work represents the first characterisation of the position and dimensions of a void detected by cosmic-ray muons with a sensitivity of a few centimetres.

New projects exploring the Giza pyramids are now sprouting. A particularly ambitious project by researchers in Egypt, the US and the UK – Exploring the Great Pyramid (EGP) – uses movable large-area detectors to perform precise 3D tomography of the pyramid. Thanks to its larger surface and some methodological improvements, EGP aims to surpass ScanPyramids’ sensitivity after two years of data taking. Although still at the simulation studies stage, the detector technology – plastic scintillator bars with a triangular section and encapsulated wavelength shifter fibres – is already being used by the ongoing MURAVES muography project to scan the interior of the Vesuvius volcano in Italy. The project will also profit from synergy with the upcoming Mu2e experiment at Fermilab, where the very same detectors are used. Finally, proponents of the ScIDEP (Scintillator Imaging Detector for the Egyptian Pyramids) experiment from Egypt, the US and Belgium are giving Khafre’s pyramid a second look, using a high-resolution scintillator-based detector to take data from the same location as Alvarez’s spark chambers.

Muography data in the Xi’an city walls

Pyramids easily make headlines, but there is no scarcity of monuments around the world where muography can play a role. Recently, a Russian team used emulsion detectors to explore the Svyato–Troitsky Danilov Monastery, the main buildings of which have undergone several renovations across the centuries but with associated documentation lost. The results of their survey, published in 2022, include evidence for two unknown rooms and areas of significantly higher density (possible walls) in the immured parts of certain vaults, and of underground voids speculated to be ancient crypts or air ducts. Muography is also being used to preserve buildings of historical importance. The defensive wall structures of Xi’an, one of the Four Great Ancient Capitals of China, suffered serious damage due to heavy rainfall, but repairs in the 1980s were insufficiently documented, motivating non-destructive techniques to assess their internal status. Taking data from six different locations using a compact and portable muon detector to extract a 3D density map of a rampart, a Chinese team led by Lanzhou University has recently reported density anomalies that potentially pose safety hazards (see “Falling walls” figure). 

The many flavours of muography

All the examples described so far are based on the same basic principle as Alvarez’s experiment: the attenuation of the muon flux through dense matter. But there are other ways to utilise muons as probes. For example, it is possible to exploit their deflection in matter due to Coulomb scattering from nuclei, offering the possibility of elemental discrimination. Such muon scattering tomography (MST) has been proposed to help preserve the Santa Maria del Fiore cathedral in Florence, whose iconic dome, built between 1420 and 1436 by Filippo Brunelleschi, is cracking under its own weight. Accurate modelling is needed to guide reinforcement efforts, but uncertainties exist on the internal structure of the walls. According to some experts, Brunelleschi might have inserted iron chains inside the masonry of the dome to stabilise it; however, no conclusive evidence has been obtained with traditional remote-sensing methods. Searching for iron within masonry is therefore the goal of the proposed experiment (see “Preserving a masterpiece” figure), for which a proof-of-principle test on a mock-up wall has already been carried out in Los Alamos.
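The sensitivity of MST to material composition can be read off from the standard multiple-scattering parametrisation (the Highland formula as given by the Particle Data Group, quoted here as general background rather than as the analysis adopted by the Florence proposal):

\[
\theta_0 \simeq \frac{13.6\ \mathrm{MeV}}{\beta c\, p}\, \sqrt{\frac{x}{X_0}}
\left[ 1 + 0.038 \ln\frac{x}{X_0} \right],
\]

where p is the muon momentum, x the thickness traversed and X₀ the radiation length of the material. Iron (X₀ ≈ 1.8 cm) has a radiation length roughly an order of magnitude shorter than masonry, so muons crossing an embedded chain scatter noticeably more – which is what makes elemental discrimination possible in principle.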

Beyond cultural heritage, muography has also been advocated as a powerful remote-sensing method for a variety of applications in the nuclear sector. It has been used, for example, to assess the damage and impact of radioactivity in the Fukushima power plant, where four nuclear reactors were damaged in 2011. Absorption-based muography was applied to map density differences within the reactors – for example the thickness of the walls – while MST was applied to locate the nuclear fuel. Muography, especially MST, has allowed the investigation of other extreme systems, including blast furnaces and nuclear waste barrels.

Santa Maria del Fiore cathedral

Volcanology is a further important application of muography, where it is used to discover empty magma chambers and voids. As muons are better absorbed by thick and dense objects, such as rocks on the bottom of a volcano, the absorption provides key information about its inner structure. The density images created via muography can even be fed into machine-learning models to help predict eruptive patterns, and similar methods can be applied to glaciology, as has been done to estimate the topography of mountains hidden by overlaying glaciers. Among these projects is Eiger-μ, designed to explore the mechanisms of glacial erosion.

Powerful partnership 

Muography creates bridges across the world between particle physics and cultural-heritage preservation. The ability to perform radiography of a large object from a distance or from pre-existing tunnels is very appealing in situations where invasive excavations are impossible, as is often the case in highly populated urban or severely constrained areas. Geophysical remote-sensing methods are already part of the archaeological toolkit, but in general they are expensive, have a limited resolution and demand strong model assumptions for interpreting the data. Muography is now gaining acceptance in the cultural-heritage preservation world because its data are intrinsically directional and can be easily interpreted in terms of density distributions.

From the pioneering work of Alvarez to the state-of-the-art systems available today, progress in muography has gone hand-in-hand with the development of detectors for particle physics. The ScanPyramids project, for example, uses micropattern gaseous detectors such as those developed within the CERN RD51 collaboration, as well as nuclear emulsion detectors like those of the OPERA neutrino experiment, while the upcoming EGP project will benefit from detector technologies for the Mu2e experiment at Fermilab. R&D for next-generation muography includes the development of scintillator-based muon detectors, resistive plate chambers, trackers based on multi-wire proportional chambers and more. There are proposals to use microstrip silicon detectors from the CMS experiment and Cherenkov telescopes inspired by the CTA astrophysics project, showing how R&D for fundamental physics continues to drive exotic applications in archaeology and cultural-heritage preservation.

The post Cosmic rays for cultural heritage appeared first on CERN Courier.

]]>
Feature Taking advantage of detectors used for particle physics, cosmogenic muons are becoming powerful tools for non-destructive imaging of large structures such as pyramids. https://cerncourier.com/wp-content/uploads/2023/04/CCMayJun23_MUOG_pyramids.jpg
Event celebrates 50 years of Kobayashi–Maskawa theory https://cerncourier.com/a/event-celebrates-50-years-of-kobayashi-maskawa-theory/ Fri, 21 Apr 2023 09:15:28 +0000 https://preview-courier.web.cern.ch/?p=108308 150 participants from all over the world gathered to mark the generalisation of quark mixing to three generations and its implications today.

The post Event celebrates 50 years of Kobayashi–Maskawa theory appeared first on CERN Courier.

]]>
Quarks change their flavour through the weak interaction, and the strength of the flavour mixing is parametrised by the Cabibbo–Kobayashi–Maskawa (CKM) matrix, which is an essential part of the Standard Model. This year marks the 60th anniversary of Nicola Cabibbo’s paper describing the mixing between down and strange quarks. It also marks the 50th anniversary of the paper by Makoto Kobayashi and Toshihide Maskawa, published in February 1973, which explained the origin of CP violation by generalising the quark mixing to three generations. To celebrate the magnificent accomplishments of quark-flavour physics during the past 50 years and to discuss the future of this important topic, a symposium was held at KEK in Tsukuba, Japan on 11 February, attracting about 150 participants from around the globe, including Makoto Kobayashi himself.

Opening the event, Masanori Yamauchi, director-general of KEK, summarised the early history of Kobayashi-Maskawa (KM) theory and the ideas to test it as a theory of CP violation. He recalled his time as a member of the Belle collaboration at the KEKB accelerator, including the memorable competition with the BaBar experiment at SLAC during the late 1990s and early 2000s, which finally led to the conclusion that KM theory explains the observed CP violation. Kobayashi and Maskawa shared one half of the 2008 Nobel Prize in Physics “for the discovery of the origin of the broken symmetry which predicts the existence of at least three families of quarks in nature”.

The scientific sessions were opened by Amarjit Soni (BNL), who summarised various ideas for measuring CP violation in cascade decays of B mesons, including the celebrated papers by A I Sanda and co-workers in 1980–1981 that gave strong motivation to build B factories. Stephen Olsen (Chung Ang University), who was one of the leaders of the Belle collaboration, looked back at the situation in the early 1980s when B-meson mixing was first observed, and emphasised the role of the accelerator physicists who achieved the 100-fold increase in luminosity that was necessary to measure the CP-violating angles. Adrian Bevan (Queen Mary University of London) added a perspective from the BaBar experiment, while the more recent impressive developments at the LHCb experiment were summarised by Patrick Koppenburg (Nikhef).

Theoretical developments remain an integral part of quark-flavour physics. Matthias Neubert (University of Mainz) gave an overview of the theoretical tools developed to understand B-meson decays, which include heavy-quark symmetry, heavy-quark effective field theory, heavy-quark expansion and QCD factorisation, and Zoltan Ligeti (LBNL) summarised concurrent developments of theory and experiment to determine the sides of the CKM triangle. Lattice QCD also played a central role in the determination of the CKM matrix elements by providing precision computation of non-perturbative parameters, as discussed by Aida El-Khadra (University of Illinois).

There are valuable lessons from the KM paper when applied to the search beyond the Standard Model

The B sector is not the only place where CP violation is observed. Indeed, it was first observed in kaon mixing, and important pieces of information have been obtained from the kaon system since then. A number of theoretical ideas dedicated to the study of kaon CP violation were discussed by Andrzej Buras (Technical University of Munich), and experimental projects were reviewed by Taku Yamanaka (Osaka University).

There are still unsolved mysteries in quark-flavour physics. The most notable is the origin of the fermion generations, which may only be understood by accumulating more data to find any discrepancy with the Standard Model. SuperKEKB/Belle II, the successor of KEKB/Belle, plans to accumulate 50 times more data in the coming decades, while LHCb will continue to improve the precision of its measurements in hadronic collisions. Nanae Taniguchi (KEK) reported the current status of SuperKEKB/Belle II, which has been in physics operation since 2019 and has already broken peak-luminosity records in e⁺e⁻ collisions. Gino Isidori (University of Zurich) gave his view on the possible shape of physics to come. “There are valuable lessons from the KM paper, which are still valuable today, when applied to the search beyond the Standard Model,” he concluded.

As a closing remark, Makoto Kobayashi reminisced about the time when he built the theory as well as the time when the KEKB/Belle experiment was running. “I was able to watch the development of the B factory so closely from the very beginning,” he said. “I am grateful to the colleagues who gave me such a great opportunity.”

Unconventional music @ CERN https://cerncourier.com/a/unconventional-music-cern/ Fri, 03 Mar 2023 12:06:45 +0000 https://preview-courier.web.cern.ch/?p=107980 Honouring the 100th anniversary of Einstein’s Nobel prize, the Swedish embassy in Bern collaborated with CERN for an event connecting physics and music.

Honouring the 100th anniversary of Einstein’s Nobel prize, the Swedish embassy in Bern collaborated with CERN for an event connecting science and music, held at the CERN Globe of Science and Innovation on 19 October. The event was originally planned for 2021 but was postponed due to the pandemic.

Brian Foster (University of Oxford) talked about Einstein’s love of music and of playing the violin, illustrated with many photos showing Einstein with some of the well-known violinists of the time. Around the period Einstein was awarded the Nobel prize, the Russian engineer Lev Termen invented the theremin, an instrument consisting of two antennae that is played without physical contact. This caught Einstein’s attention, and it is said that he even played the theremin himself once.

Delving further into the unconventional, LHC physicists performed “Sonification of the LHC” by Domenico Vicinanza (GEANT and Anglia Ruskin University), for which the physicist-turned-composer mapped data recorded by the LHC experiments between 2010 and 2013 into music. First performed in 2014 on the occasion of CERN’s 60th anniversary, Vicinanza’s piece is intended as a metaphor for scientific cooperation, in which different voices and perspectives can reach a common goal only by playing together.

There followed the debut of an even more unconventional piece of music by The Stone Martens – a Swiss and Swedish “noise collaboration” improvised by Henrik Rylander and Roland Bucher. By sending the output of his theremin through guitar-effects pedals, Rylander created a unique sound. Together with Bucher’s self-made “noise table”, with which he sampled acoustic instruments and everyday objects, the duo created a captivating, otherworldly sound collage that was well received by the 160-strong audience. The event closed with an unconventional Bach concerto for two violins in which these unique sounds were fused with traditional instruments. Anyone interested in experiencing the music for themselves can find a recorded version at https://indico.cern.ch/event/1199556/.

Remembering the W discovery https://cerncourier.com/a/remembering-the-w-discovery/ Tue, 10 Jan 2023 12:14:34 +0000 https://preview-courier.web.cern.ch/?p=107547 Former UA2 spokesperson Luigi Di Lella recalls the events leading to the discovery of the W and Z bosons at CERN 40 years ago.

A W event recorded by UA1 in 1982

When the W and Z bosons were predicted in the mid-to-late 1960s, their masses were not known. Experimentalists therefore had no idea what energy they needed to produce them. That changed in 1973, when Gargamelle discovered neutral-current neutrino interactions and measured the cross-section ratio between neutral- and charged-current interactions. This ratio provided the first direct determination of the weak mixing angle, which, via the electroweak theory, predicted the W-boson mass to lie between 60 and 80 GeV, and the Z mass between 75 and 95 GeV – at least twice the energy of the leading accelerators of the day. 
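The logic of that prediction can be written down in a couple of lines. At tree level, in standard textbook notation rather than as quoted from the original papers, the electroweak theory gives

m_W \;=\; \left(\frac{\pi\alpha}{\sqrt{2}\,G_F}\right)^{1/2}\frac{1}{\sin\theta_W}\;\approx\;\frac{37.3\ {\rm GeV}}{\sin\theta_W},
\qquad
m_Z \;=\; \frac{m_W}{\cos\theta_W},

so the range of sin²θ_W allowed by the early neutral-current data translated directly into the mass windows quoted above.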

By then, the world’s first hadron collider – the Intersecting Storage Rings (ISR) at CERN – was working well. Kjell Johnsen proposed a new superconducting ISR in the same tunnel, capable of reaching 240 GeV. A study group was formed. Then, in 1976, Carlo Rubbia, David Cline and Peter McIntyre suggested adding an antiproton source to a conventional 400 GeV proton accelerator, either at Fermilab or at CERN, to transform it into a pp̄ collider. The problem was that the antiprotons had to be accumulated and cooled if the target luminosity (10²⁹ cm⁻² s⁻¹, providing about one Z event per day) was to be reached. Two methods were proposed: stochastic cooling by Simon van der Meer at CERN and electron cooling by Gersh Budker in Novosibirsk.

CERN Director-General John Adams wasn’t too happy that as soon as the SPS had been built, physicists wanted to convert it into a pp̄ collider. But he accepted the suggestion, and the idea of a superconducting ISR was abandoned. Following the Initial Cooling Experiment, which showed that the luminosity target was achievable with stochastic cooling, the Spp̄S was approved in May 1978 and the construction of the Antiproton Accumulator (AA) by van der Meer and collaborators began. Around that time, the design of the UA1 experiment was also approved.

A group of us proposed a second, simpler experiment in another interaction region (UA2), but it was put on hold for financial reasons. Then, at the end of 1978, Sam Ting proposed an experiment to go in the same place. His idea was to surround the beam with heavy material so that everything would be absorbed except for muons, making it good at identifying Z → μ⁺μ⁻ but far from good for W bosons decaying to a muon and a neutrino. In a tense atmosphere, Ting’s proposal was turned down and ours was approved.

First sightings

The first low-intensity pp̄ collisions arrived in late 1981. In December 1982 the luminosity reached a sufficient level, and by the following month UA1 had recorded six W candidates and UA2 four. The background was minimal; there was nothing else we could think of that would produce such events. Carlo presented the UA1 events and Pierre Darriulat the UA2 ones at a workshop in Rome on 12–14 January 1983. On 20 January, Carlo announced the W discovery at a CERN seminar, and the next day I presented the UA2 results, confirming UA1. In UA2 we never discussed priority, because we all knew that it was Carlo who had made the whole project possible.

Luigi Di Lella

The same philosophy guided the discovery of the Z boson. UA2 had recorded a candidate Z → e⁺e⁻ event in December 1982, also presented by Pierre at the Rome workshop. One electron was perfectly clear, whereas the other had produced a shower with many tracks. I had shown the event to Jack Steinberger, who strongly suggested we publish immediately; however, we decided to wait for the first “golden” event with both electrons unambiguously identified. Then, one night in May 1983, UA1 found a Z. As with ours, only one electron satisfied all electron-identification criteria, but Carlo used the event to announce a discovery. The UA1 results (based on four Z → e⁺e⁻ events and one Z → μ⁺μ⁻) were published that July, followed by the UA2 results (based on eight Z → e⁺e⁻ events, including the 1982 one) a month later.

The Spp̄S ran until 1990, when it became clear that Fermilab’s Tevatron was going to put us out of business. In 1984–1985 the energy was increased from 546 to 630 GeV and in 1986 another ring was added to the AA, increasing the luminosity 10-fold. Following the 1984 Nobel prize to Rubbia and van der Meer, UA1 embarked on an ambitious new electromagnetic calorimeter that never quite worked. UA2 went on to make a precise measurement of the ratio mW/mZ, which, along with the first precise measurement of mZ at LEP, enabled us to determine the W mass with 0.5% precision and, via radiative corrections, to predict the mass of the top quark (160 +50/−60 GeV) several years before the Tevatron discovered it.
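Schematically, and in standard textbook notation rather than UA2’s own, the chain of reasoning was

m_W \;=\; \left(\frac{m_W}{m_Z}\right)_{\!\rm measured}\times m_Z^{\rm LEP},
\qquad
m_W^2\left(1-\frac{m_W^2}{m_Z^2}\right) \;=\; \frac{\pi\alpha}{\sqrt{2}\,G_F}\,\frac{1}{1-\Delta r},

\Delta r \;\simeq\; \Delta\alpha \;-\; \frac{\cos^2\theta_W}{\sin^2\theta_W}\,\Delta\rho \;+\;\ldots,
\qquad
\Delta\rho \;=\; \frac{3\,G_F\,m_t^2}{8\sqrt{2}\,\pi^2},

so once mW and mZ were both known precisely, the size of the radiative correction Δr – dominated by the quadratic top-quark term in Δρ – pointed to the top mass.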

Times have certainly changed since then, but the powerful interplay between theory, experiment and machine builders remains essential for progress in particle physics. 

50 Years of Theoretical Physics https://cerncourier.com/a/50-years-of-theoretical-physics/ Tue, 10 Jan 2023 11:45:17 +0000 https://preview-courier.web.cern.ch/?p=107651 This carefully crafted edition highlights the scientific life of Frank Wilczek, and the developments in theoretical physics related to the 2004 Nobel laureate's research.

Frank Wilczek: 50 Years of Theoretical Physics

This carefully crafted edition highlights the scientific life of 2004 Nobel laureate Frank Anthony Wilczek, and the developments of theoretical physics related to his research. Frank Wilczek: 50 Years of Theoretical Physics is a collection of essays, original research papers and the reminiscences of Wilczek’s friends, students and followers. Wilczek is an exceptional physicist with an extraordinary mathematical talent. The 23 articles represent his vivid research journey from pure particle physics to cosmology, quantum black holes, gravitation, dark matter, applications of field theory to condensed matter physics, quantum mechanics, quantum computing and beyond.

In 1973 Wilczek discovered, together with his doctoral advisor David Gross, asymptotic freedom, through which the field theory of the strong interaction, quantum chromodynamics (QCD), was firmly established. The same result was obtained independently that year by David Politzer, and all three shared the Nobel prize in 2004. Wilczek’s major work includes a solution of the strong-CP problem through the prediction of the hypothetical axion, a consequence of the spontaneously broken Peccei–Quinn symmetry. In 1982 he predicted the quasiparticle “anyon”, for which evidence was found in a 2D electronic system in 2020. Anyons fill the need for a new class of particle statistics in two-dimensional systems, where statistics are not restricted to the familiar fermion and boson cases.

Original research papers included in this book were written by pioneering scientists, such as Roman Jackiw and Edward Witten, who are either co-inventors or followers of Wilczek’s work. The articles cover recent developments of QCD, quantum-Hall liquids, gravitational waves, dark energy, superfluidity, the Standard Model, symmetry breaking, quantum time-crystals, quantum gravity and more. Many colour photographs, musical tributes to anyons, memories of quantum-connection workshops and his contribution to the Tsung-Dao Lee Institute in Shanghai complement the volume. The book ends with Wilczek’s publication list, which documents the most significant developments in theoretical particle physics during the past 50 years.

Wilczek is an exceptional physicist with an extraordinary mathematical talent

Though this book is not an easy read in places, and the connections between articles are not always clear, a patient and careful reader will be rewarded. The collection combines rigorous scientific discussions with an admixture of Wilczek’s life, wit, scientific thoughts and teaching – a precious and timely tribute to an exceptional physicist.

From dreams to beams: SESAME’s 30 year-long journey in science diplomacy https://cerncourier.com/a/from-dreams-to-beams-sesames-30-year-long-journey-in-science-diplomacy/ Mon, 09 Jan 2023 14:27:17 +0000 https://preview-courier.web.cern.ch/?p=107609 SESAME founder Eliezer Rabinovici describes the story behind this beacon for peaceful international collaboration, what its achievements have been, and what the future holds.

The SESAME booster and storage ring

SESAME (Synchrotron-light for Experimental Science and Applications in the Middle East) is the Middle East’s first major international research centre. It is a regional third-generation synchrotron X-ray source situated in Allan, Jordan, which broke ground on 6 January 2003 and officially opened on 16 May 2017. The current members of SESAME are Cyprus, Egypt, Iran, Israel, Jordan, Pakistan, Palestine and Turkey. Active current observers include, among others: the European Union, France, Germany, Greece, Italy, Japan, Kuwait, Portugal, Spain, Sweden, Switzerland, the UK and the US. The common vision driving SESAME is the belief that human beings can work together for a cause that furthers the interests of their own nations and that of humanity as a whole. 

The story of SESAME started at CERN 30 years ago. One day in 1993, shortly after the signature of the Oslo Accords by Israel and the Palestine Liberation Organization, the late Sergio Fubini, an outstanding scientist and a close friend and collaborator, approached me in the corridor of the CERN theory group. He told me that now was the time to test what he called “your idealism”, referring to future joint Arab–Israeli scientific projects. 

CERN is a very appropriate venue for the inception of such a project. It was built after World War II to help heal Europe and European science in particular. Abdus Salam, as far back as the 1950s, identified the light source as a tool that could help thrust what were then considered “third-world” countries directly to the forefront of scientific research. The very same Salam joined our efforts in 1993 as a member of the Middle Eastern Science Committee (MESC), founded by Sergio, myself and many others to forge meaningful scientific contacts in the region. By joining our scientific committee, Salam made public his belief in the value of Arab–Israeli scientific collaborations, something the Nobel laureate had expressed several times in private.

Participants of the SESAME users’ meeting

To focus our vision, that year I gave a talk on the status of Arab–Israeli collaborations at a meeting in Torino held on the occasion of Sergio’s 65th birthday. Afterwards we travelled to Cairo to meet Venice Gouda, the Egyptian minister for higher education, and other Egyptian officials. At that stage we were just self-appointed entrepreneurs. We were told that president Hosni Mubarak had made a decision to take politics out of scientific collaborations with Israel, so together we organized a high-quality scientific meeting in Dahab, in the Sinai desert. The meeting, held in a large Bedouin tent on 19-26 November 1995, brought together about 100 young and senior scientists from the region and beyond. It took place in the weeks after the murder of the Israeli prime minister Yitzhak Rabin, for whom, at the request of Venice Gouda, all of us stood for a moment of silence in respect. The silence echoes in my ears to this day. The first day of the meeting was attended by Jacob Ziv, president of the Israeli Academy of Sciences and Humanities, which had been supporting such efforts in general. It was thanks to the additional financial help of Miguel Virasoro, director-general of ICTP at the time, and also Daniele Amati, director of SISSA, that the meeting was held. All three decisions of support were made at watershed moments and on the spur of the moment. The meeting was followed by a very successful effort to identify concrete projects in which Arab–Israeli collaboration could be beneficial to both sides. 

But attempts to continue the project were blocked by a turn for the worse in the political situation. MESC decided to retreat to Torino, where, during a meeting in November 1996, there was a session devoted to studying the possibilities of cooperation via experimental activities in high-energy physics and light-source science. During that session, the late German scientist Gus Voss suggested (on behalf of himself and Hermann Winnick from SLAC) to bring the parts of a German light source situated in Berlin, called BESSY, which was about to be dismantled, to the Middle East. Former Director-General of CERN Herwig Schopper also attended the workshop. MESC had built sufficient trust among the parties to provide an appropriate infrastructure to turn such an idea into something concrete. 

Targeting excellent science 

A light source was very attractive thanks to the rich diversity of fields that can make use of such a facility, from biology through chemistry, physics and many more to archaeology and environmental sciences. Such a diversity would also allow the formation of a critical mass of real users in the region. The major drawback of the BESSY-based proposal was that there was no way a reconstructed dismantled “old” machine would be able to attract first-class scientists and science. 

Around that time, Fubini asked Schopper, who had a rich experience in managing complex experimental projects, to take a leadership position. The focus of possible collaborations was narrowed down to the construction of a large light source, and it was decided to use the German machine as a nucleus around which to build the administrative structure of the project. The non-relations among several of the members presented a serious challenge. At the suggestion of Schopper, following the example of the way CERN was assembled in the 1950s, the impasse was overcome by using the auspices of UNESCO to deposit the instruments for joining the project. The statutes of SESAME were to a large extent copied from those of CERN. A band of self-appointed entrepreneurs had evolved into a self-declared interim Council of SESAME, with Schopper as its president. The next major challenge was to choose a site.

SESAME beginnings

On 15 March 2000 I flew to Amman for a meeting on the subject. I met Khaled Toukan (the current director-general of SESAME) and, after studying a map sold at the hotel where we met, we discussed which site Israel would support. We also asked that a Palestinian be the director general. Due to various developments, none of which depended on Israel, this was not to happen. The decision on the site venue was taken at a meeting at CERN on 11 April 2000. Jordan, which had and has diplomatic relations with all the parties involved, was selected as the host state. BESSY was dismantled by Russian scientists, placed in boxes and shipped with assembly instructions to the Jordanian desert to be kept until the appropriate moment would arise. This was made possible thanks to a direct contribution by Koichiro Matsuura, director-general of UNESCO at the time, and to the efforts of Khaled Toukan who has served in several ministerial capacities in Jordan. 

With the administrative structure in place, it was time to address the engineering and scientific aspects of the project. Technical committees had designed a totally new machine, with BESSY serving as a boosting component. Many scientists in the region were introduced via workshops to the scientific possibilities that SESAME could offer. Scientific committees considered appropriate “day-one” beamlines, yet that day seemed very far in the future. Technical and scientific directors from abroad helped define the parameters of a new machine and identified appropriate beamlines to be constructed. Administrators and civil servants from the members started meeting regularly in the finance committee. Jordan began to build the facility to host the light source and made major additional financial contributions. 

Transformative agreements

At this stage it was time for the SESAME interim council to transform into a permanent body and in the process cut its umbilical cord from UNESCO. This transformation presented new hurdles because it was required of every member that wished to become a member of the permanent council that its head of state, or someone authorised by the head of state, sign an official document sent to UNESCO stating this wish. 

By 2008 the host building had been constructed. But it remained essentially empty. SESAME had received support from leading light-source labs all over the world – a spiritual source of strength to members to continue with the project. However, attempts to get significant funding failed time and again. It was agreed that the running costs of the project should be borne by the members, but the one-time large cost needed to construct a new machine was outside the budget parameters of most of the members, many of whom did not have a tradition of significant support for basic science. The European Union (EU) supported us in that stage only through its bilateral agreement with Jordan. In the end, several million Euros from those projects did find their way to SESAME, but the coffers of SESAME and its infrastructure remained skeletal.

Changing perceptions

In 2008 Herwig Schopper was succeeded by Chris Llewellyn Smith, another former Director-General of CERN, as president of the SESAME Council. His main challenge was to get the funding needed to construct a new light source and to remove from SESAME the perception that it was simply a reassembled old light source of little potential attraction to top scientists. In addition to searching for sources of significant financial support, there was an enormous amount of work still to be done in formulating detailed and realistic plans for the following years. A grinding systematic effort began to endow SESAME with the structure needed for a modern working accelerator, and to create associated information materials.

Llewellyn Smith, like his predecessor, also needed to deal with political issues. For the most part the meetings of the SESAME Council were totally devoid of politics. In fact, they felt to me like a parallel universe where administrators and scientists from the region get to work together in a common project, each bringing her or his own scars and prejudices and each willing to learn. That said, there were moments when politics did contaminate the spirit forming in SESAME. In some cases, this was isolated and removed from the agenda and in others a bitter taste remains. But these are just at the very margins of the main thrust of SESAME. 

Students, beamline scientists and magnets

The empty SESAME building started to be filled with radiation shields, giving the appearance of a full building. But the absence of the light-source itself created a void. The morale of the local staff was in steady decline, and it seemed to me that the project was in some danger. I decided to approach the ministry of finance in Israel. When I asked if Israel would make a voluntary contribution to SESAME of $5 million, I was not shown the door. Instead they requested to come and see SESAME, after which they discussed the proposal with Israel’s budget and planning committee and agreed to contribute the requested funds on the condition that others join them. 

Each member of the unlikely coalition – consisting of Iran, Israel, Jordan and Turkey – pledged an extra $5 million for the project in an agreement signed in Amman. Since then, Israel, Jordan and Turkey have honoured their commitments, and Iran states that it recognises its commitment but is obstructed by sanctions. The support from members encouraged the EU to dedicate $5 million to the project, in addition to the approximately $3 million directed earlier from a bilateral EU–Jordan agreement. In 2015 the INFN, under director Fernando Ferroni, gave almost $2 million. This made it possible to build a hostel – an amenity offered by most light sources – which was appropriately named after Sergio Fubini. Many leading world labs, in a heartwarming expression of support, have donated equipment for future beamlines as well as fellowships for the training of young people.

Point of no return

With their help, SESAME crossed the point of no return. The undefined stuff dreams are made of turned into magnets and girders made of real hard steel, which I was able to touch as they were being assembled at CERN. The pace of events finally accelerated, and a star-studded inauguration, attended by the king of Jordan, took place on 16 May 2017. During the ceremony, remarkably, the political delegates of the different member states listened to each other without leaving the room (as is the standard practice in other international organisations). Even more striking was that each member-state delegate taking the podium gave essentially the same speech: “We are trying here to achieve understanding via collaboration.”

At that moment the SESAME Council presidency passed from Chris Llewellyn Smith to a third former CERN Director-General, Rolf Heuer. The high-quality 2.5 GeV electron storage ring at the heart of SESAME started operation later that year, driving two X-ray beamlines: one dedicated to X-ray absorption fine structure/X-ray fluorescence (XAFS/XRF) spectroscopy, and another to infrared spectro-microscopy. A third powder-diffraction beamline is presently being added, while a soft X-ray beamline “HESEB” designed and constructed by five Helmholtz research centres is being commissioned. In 2023 the BEAmline for Tomography at SESAME (BEATS) will also be completed, with the construction and commissioning of a beamline for hard X-ray full-field tomography. 

The unique SESAME facility started operating with uncanny normality. Well over 100 proposals for experiments were submitted and refereed, and beam time was allocated to the chosen experiments. Data was gathered, analysed and the results were and are being published in first-rate journals. Given the richness of archaeological and cultural heritage in the region, SESAME’s beamlines offer a highly versatile tool for researchers, conservators and cultural-heritage specialists to work together on common projects. The first SESAME Cultural Heritage Day took place online on 16 February 2022 with more than 240 registrants in 39 countries (CERN Courier July/August 2022 p19). 

Powered by renewable energy

Thanks to the help of the EU, SESAME has also become the world’s first “green” light source, its energy entirely generated by solar power, which also has the bonus of stabilising the energy bill of the machine. There is, however, concern that the only component used from BESSY, the “Microtron” radio-frequency system, may eventually break down, thus endangering the operation of the whole machine. 

SESAME continues to operate on a shoe-string budget. The current approved 2022 budget is about $5.3 million, much smaller than that of any modern light source. I marvel at the ingenuity of the SESAME staff allowing the facility to operate, and am sad to sense indifference to the budget among many of the parties involved. The world’s media has been less indifferent: the BBC, The New York Times, Le Monde, The Washington Post, Brussels Libre, The Arab Weekly, as well as regional newspapers and TV stations, have all covered various aspects of SESAME. In 2019 the AAAS highlighted the significance of SESAME by awarding five of its founders (Chris Llewellyn Smith, Eliezer Rabinovici, Zehra Sayers, Herwig Schopper and Khaled Toukan) with its 2019 Award for Science Diplomacy. 

SESAME was inspired by CERN, yet it was a much more challenging task to construct. CERN was built after the Second World War was over, and it was clear who had won and who had lost. In the Middle East the conflicts are not over, and there are different narratives on who is winning and who is losing, as well as what win or lose means. For CERN it took less than 10 years to set up the original construct; for SESAME it took about 25 years. Thus, SESAME now should be thought of as CERN was in around 1960.

On a personal note, it brings immense happiness that for the first time ever, Israeli scientists have carried out high-quality research at a facility established on the soil of an Arab country, Jordan. Many in the region and beyond have taken their people to a place their governments most likely never dreamt of or planned to reach. It is impossible to give due credit to the many people without whom SESAME would not be the success it is today. 

The non-relations among several of the members presented a serious challenge

In many ways SESAME is a very special child of CERN, and often our children can teach us important lessons. As president of the CERN Council, I can say that the way in which the member states of SESAME conducted themselves during the decades of storms that affect our region serves as a benchmark for how to keep bridges for understanding under the most trying of circumstances. The SESAME spirit has so far been a lighthouse even to the CERN Council, in particular in light of the invasion of Ukraine (an associate member state of CERN) by the Russian Federation. Maintaining this attitude in a stormy political environment is very difficult. 

However SESAME’s story ends, we have proved that the people of the Middle East have within them the capability to work together for a common cause. Thus, the very process of building SESAME has become a beacon of hope to many in our region. The responsibility of SESAME in the next years is to match this achievement with high-quality scientific research, but it requires appropriate funding and help. SESAME is continuing very successfully with its mission to train hundreds of engineers and scientists in the region. Requests for beam time continue to rise, as do the number of publications in top journals. 

If one wants to embark on a scientific project to promote peaceful understanding, SESAME offers at least three important lessons: it should be one to which every country can contribute, learn and profit significantly from; its science should be of the highest quality; and it requires an unbounded optimism and an infinite amount of enthusiasm. My dream is that in the not-so-distant future, people will be able to point to a significant discovery and say “this happened at SESAME”.

The Sketchbook and the Collider https://cerncourier.com/a/the-sketchbook-and-the-collider/ Fri, 16 Dec 2022 15:37:11 +0000 https://preview-courier.web.cern.ch/?p=107381 Annecy exhibition connected the artist’s sketchbook and the physicist’s collider as arenas where the invisible is made visible.


“Reality is not what it seems: Drawing links between fine art and particle physics” was the title of the art–science exhibit set up by the Laboratoire d’Annecy de Physique des Particules (LAPP) on the occasion of the 2022 Fête de la Science. The installation was part of an ongoing collaboration between UK fine artist Ian Andrews and ATLAS physicist Kostas Nikolopoulos called “The Sketchbook and the Collider”, which was initiated in 2018 while Andrews was an artist in residence at the University of Birmingham. The project takes viewers on a journey where the artist’s sketchbook and the experimental physicist’s collider can both be seen as arenas where the invisible is made visible, “sometimes violently”, by bringing elements together and examining the traces of hidden interactions. It also comprises performative pieces that involve “live” drawing and the cooperation, participation and interaction of artists, scientists and members of the public.

“About 900 visitors spanning all ages, professions and cultural backgrounds engaged and interacted with the exhibition, either attracted by the arts and the science, or caught by surprise on their way to Lake Annecy, in a very special Higgs and LHC celebration year,” said organiser Claire Adam-Bourdarios of LAPP.

SLAC at 60: past, present, future https://cerncourier.com/a/slac-at-60-past-present-future/ Thu, 08 Dec 2022 20:18:19 +0000 https://preview-courier.web.cern.ch/?p=107279 Available to watch now, JoAnne Hewett, SLAC National Accelerator Laboratory, celebrates 60 years of SLAC.


This year, SLAC celebrates its remarkable past while continuing its quest for a bright future. This presentation takes a look at how it all started with the lab’s two-mile-long linear accelerator and accompanying groundbreaking discoveries in particle physics; explores how the lab’s scientific mission has evolved over time to include many disciplines ranging from X-ray science to cosmology; and discusses the most exciting perspectives for future research, from developing new quantum technology to pushing the frontiers of our understanding of the universe on its largest scales.

JoAnne Hewett is a world-class theoretical physicist with well over 100 publications in theoretical high-energy physics. Her research probes the fundamental nature of space, matter and energy, where she most enjoys devising experimental tests for preposterous theoretical ideas. She is best known for her work on the possible existence of extra spatial dimensions. She has twice been a member of the HEPAP advisory panel and made major contributions to the recent Particle Physics Project Prioritization Panel (“P5”) plan, which defines US high-energy physics research priorities for the next 10 years.

Since joining the SLAC faculty in 1994, JoAnne has served in key leadership roles at SLAC, including head of the theoretical physics group, deputy director of the Science Directorate and director of SLAC’s Elementary Particle Physics (EPP) Division. During her tenure as EPP Division director, JoAnne aligned the program with the highest P5 priorities by establishing a neutrino theory program and extending SLAC’s experimental efforts in accelerator-based neutrino physics and neutrinoless double-beta decay. She was elected a fellow of the American Physical Society in 2008, named a fellow of the American Association for the Advancement of Science in 2009, and served as chair of the American Physical Society’s Division of Particles & Fields in 2016.





 

Hymn to HERMES https://cerncourier.com/a/hymn-to-hermes/ Tue, 06 Sep 2022 16:25:09 +0000 https://preview-courier.web.cern.ch/?p=106257 The former HERMES experiment at DESY, a pioneer in unravelling the mysteries of the proton spin, is the protagonist in a new book by Richard Milner and Erhard Steffens.

The post Hymn to HERMES appeared first on CERN Courier.

]]>
HERMES detector

One hundred years ago, Otto Stern and Walther Gerlach performed their ground-breaking experiment shooting silver atoms through an inhomogeneous magnetic field, separating them according to their spatially quantised angular momentum. It was a clear victory of quantum theory over the still widely used classical picture of the atom. The results also paved the way to the introduction of the concept of spin, an intrinsic angular momentum, as an inherent property of subatomic particles. 

The idea of spin was met with plenty of scepticism. Abraham Pais noted in his book George Uhlenbeck and the Discovery of Electron Spin that Ralph Kronig, finishing his PhD at Columbia University in 1925 and travelling through Europe, introduced the idea to Heisenberg and Pauli, who dryly commented that “it is indeed very clever but of course has nothing to do with reality”. Feeling ridiculed, Kronig dropped the idea. A few months later, still against strong resistance from established experts but this time with sufficient backing from their mentor Paul Ehrenfest, the Leiden graduate students George Uhlenbeck and Samuel Goudsmit published their seminal Nature paper on the “spinning” electron. “In the future I shall trust my own judgement more and that of others less,” wrote Kronig in a letter to Hendrik Kramers in March 1926.

Spin crisis

Spin quickly became a cornerstone of 20th-century physics. Related works of paramount importance were Pauli’s exclusion principle and Dirac’s description of relativistic spin-1/2 particles, as well as the spin-statistics theorems (namely the Fermi–Dirac and Bose–Einstein distributions for identical half-integer–spin and integer–spin particles, respectively). But more than half a century after its introduction, spin re-emerged as a puzzle. By then, a rather robust theoretical framework, the Standard Model, had been established, within which many precision calculations became a comfortable standard. It could all have been that simple: since the proton consists of two valence-up and one valence-down quarks, with spin up and down (i.e. parallel and antiparallel to the proton’s spin, respectively), the origin of its spin is easily explained. The problem dubbed “spin crisis” arose in the late 1980s, when the European Muon Collaboration at CERN found that the contribution of quarks to the proton spin was consistent with zero, within the then still-large uncertainties, and that the so-called Ellis–Jaffe sum rule – ultimately not fundamental but model-dependent – was badly violated. What had been missed?
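In the language used today, the puzzle is often stated as a sum rule for the proton spin – written here in the common schematic decomposition, which is textbook material rather than anything specific to the book under review:

\frac{1}{2} \;=\; \frac{1}{2}\,\Delta\Sigma \;+\; \Delta G \;+\; L_q \;+\; L_g ,

where ΔΣ is the quark-spin contribution (naively expected to be close to 1 in the simple valence picture), ΔG the gluon-spin contribution, and L_q and L_g the quark and gluon orbital angular momenta. The EMC result implied that ΔΣ is far smaller than the naive expectation, which is what launched the programme described in the following paragraphs.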

Today, after decades of intense experimental and theoretical work, our picture of the proton and its spin emerging from high-energy interactions has changed substantially. The role of gluons both in unpolarised and polarised protons is non-trivial. More importantly, transverse degrees of freedom, both in position and momentum space, and the corresponding role of orbital angular momentum, have become essential ingredients in the modern description of the proton structure. This description goes beyond the picture of collinearly moving partons encapsulated by the fraction of the parent proton’s momentum and the scale at which they are probed; numerous effects, unexplainable in the simple picture, have now become theoretically accessible.

Understanding the mysteries 

The HERMES experiment at DESY, which operated between 1995 and 2007, was a pioneer in unravelling the mysteries of the proton spin, and it is the protagonist of a new book by Richard Milner and Erhard Steffens, two veterans of the field and driving forces behind HERMES. The subtitle and preface make clear that this is a personal account and recollection of the history of HERMES, from an emergent idea on both sides of the Atlantic, to a nascent collaboration and experiment, and finally to an extremely successful addition to the physics programme of HERA (the world’s only lepton–proton collider, which started running at DESY 30 years ago and operated for one and a half decades).

Milner and Steffens are both experts on polarised gas targets, with complementary backgrounds leading to rather different perspectives. Indeed, HERMES was independently developed within a North American initiative, in which Milner was the driving force, and a European initiative around the Heidelberg MPI-K led by Klaus Rith, with Erhard Steffens as a long-time senior group member. In 1988 two independent letters of intent submitted to DESY triggered sufficient interest in the idea of a fixed-target experiment with a polarised gas target internal to the HERA lepton ring; the proponents were subsequently urged to collaborate in submitting a common proposal. In the meantime, HERMES’ feasibility needed to be demonstrated. A sufficiently high lepton-polarisation had to be established, as well as smooth running of a polarised gas target in the harsh HERA environment without disturbing the machine and the main HERA experiments H1 and Zeus. 

By summer 1993, HERMES was fully approved, and in 1995 the data taking started with polarised ³He. The subsequently used target of polarised hydrogen or deuterium employed the same concepts that Stern and Gerlach had already used in their famous experiment. The next decade saw several upgrades and additions to the physics programme, and data taking continued until summer 2007. In all those years, the backbone of HERMES was an intense and polarised lepton beam that traversed a target of pure gas in a storage cell, highly polarised or unpolarised, avoiding extensive and in parts model-dependent corrections. This configuration was combined with a detector that, from the very beginning, was designed not only to detect the scattered leptons but also the “spray” produced in coincidence. These features allowed a diverse set of processes to be studied, leading to numerous pioneering measurements and insights that motivated, and continue to motivate, new experimental programmes around the world, including some at CERN.

Richard Milner and Erhard Steffens provide extensive insights, in particular into the historic aspects of HERMES, which are difficult to obtain elsewhere. The book gives an insightful discussion of the installation of the experiment and of the outstanding efforts of a group of highly motivated and dedicated individuals who worked too often in complete ignorance of (or in defiance of) standard working hours. Their account enthrals the reader with vivid anecdotes, surprising twists and personal stories, all told in a colloquial style. While clearly not meant as a textbook – indeed, one might notice small mistakes and inconsistencies in a few places – this book makes for worthwhile and enjoyable reading, not only for people familiar with the subject but equally for outsiders. In particular, younger generations of physicists working in large-scale collaborations might be surprised to learn that it needs only a small group and little time to start an experiment that goes on to have a tremendous impact on our understanding of nature’s basic constituents.

Parallels https://cerncourier.com/a/parallels/ Tue, 06 Sep 2022 16:22:12 +0000 https://preview-courier.web.cern.ch/?p=106272 For the more inquisitive viewer, Parallels offers a chance to explore the often even more incredible ideas explored for real in particle physics.

Parallels

Released in March 2022 on Disney+, Parallels merges two of the most popular concepts in science fiction: time travel and the multiverse. The French-language series, created by Quoc Dan Trang and directed by Benjamin Rocher and Jean-Baptiste Saurel, is set in a mountain village on the French–Swiss border, where a particle-physics laboratory called “ERN” and a collider strongly resembling the LHC play a major role.

The story begins with a group of four friends who recently graduated from middle school celebrating one of their birthdays near an area where, 10 years earlier, a kid called Hugo disappeared. At the same time, ERN is performing an experiment with its particle accelerator. However, something goes wrong. The lights go out in the village, while a strange space–time phenomenon unfolds, transporting the teenagers to different timelines once the lights are restored. Does this have anything to do with the particle accelerator? Where, or rather “when” are they? Each of the teenagers tries to unravel their temporal confusion in an attempt to return to their original timeline.

Parallels offers a chance to go beyond fiction and explore the often even more incredible ideas explored for real in particle physics

Although the age of the main characters targets younger audiences, Parallels addresses topics such as depression, regret and family issues, which, combined with some humour, make it relevant to other age groups. The visual effects and music create a suspenseful atmosphere and the compact nature of the series (six episodes of around 35 minutes each) draws the viewer into watching it in a single session. 

CERN’s experiments and locations are referenced several times throughout, ranging from visual details in the ERN buildings to mentions of ATLAS, CMS and the Antiproton Decelerator – going so far as to reference an “FCC scheduled for operations in October 2025”. The Globe of Science and Innovation and the CMS silicon tracker are also represented. 

Many of the concepts introduced, especially those related to the LHC experiments, are not scientifically accurate. The clear depiction of CERN in all but name may also make some physicists feel uncomfortable, given that the plot plays on YouTube-based conspiracy theories about what CERN’s experiments are capable of. For young science-fiction lovers, however, and especially for those who love to unravel temporal paradoxes, as in the popular Netflix series Stranger Things, Parallels is worth a look. For the more inquisitive and open-minded viewer, it also offers a chance to go beyond fiction and explore the often even more incredible ideas explored for real in particle physics. 

Introduction to the Standard Model and Beyond https://cerncourier.com/a/introduction-to-the-standard-model-and-beyond/ Thu, 14 Jul 2022 16:12:28 +0000 https://preview-courier.web.cern.ch/?p=102368 Stuart Raby's book is a useful resource for designing lectures in quantum field theory and physics beyond the Standard Model, writes Martin Bauer.

Introduction to the Standard Model and Beyond

Stuart Raby has written a modern, comprehensive textbook on quantum field theory, the Standard Model (SM) and its possible extensions. The focus of the book is on symmetries, and it contains a wealth of recent experimental results on Higgs and neutrino physics, which sets it apart from other textbooks addressing the same audience. It is published at a time when the incredible success story of the SM has come to a close with the discovery of the Higgs boson, and when the upcoming neutrino experiments promise to probe beyond-the-SM physics.

Raby is the author of some of the most important papers on supersymmetric grand unified theories and the book reflects that. It is no easy task to cover such a wide range of topics, from the basics of group theory to very advanced concepts such as gauge and gravity-mediated supersymmetry breaking, in one book. Raby devotes 120 pages to the basics of group theory, representations of the Poincaré group and the construction of the S matrix to provide the necessary foundations for the introduction to quantum electrodynamics in part III. Parts IV–VI introduce the reader to discrete symmetries, flavour symmetries and spontaneous symmetry breaking. Next, Raby describes two “Roads to the Standard Model” following the development of quantum chromodynamics and of electroweak theory, before arriving at the SM in part IX. The remaining parts deal with neutrino physics, grand unified theories and the minimal supersymmetric SM. 

There are no omissions topic-wise, which makes the book very comprehensive. This comes at a price, however. In several places, complicated topics are discussed with only the most minimal of context, reading like a collection of equations rather than a textbook. Two examples of this are the discussion of causality for fermionic fields or the step from global to local supersymmetry, to which the author devotes only half a page each. In other places, more cross-referencing would improve legibility. For example, the chapter on SU(5) grand unified theory does not mention the automatic cancellation of gauge anomalies, a topic previously introduced in the context of the SM.

The use of materials is very distinctive. I doubt there is another book on the market that presents the reader with such a wealth of plots, figures and sketches, including recent experimental results on all the important topics discussed. The most important plots are reproduced in 12 pages of colour tables in the centre. There are exercises for the first five parts and a single Mathematica notebook is printed for Wigner rotations. Another distinguishing feature are the detailed suggested projects to use during a two-term course based on the book.

A very useful resource for designing a lecture of quantum field theory and beyond-the-SM physics

Although advertised as useful for both theorists and experimentalists, it is undeniably a book written from a theorist’s perspective. This becomes most clear in the latter parts, where relevant sections of the plots presenting experimental results remain unexplained. That being said, other very important experimental topics are explained, which you will not find in other textbooks about the SM. Raby explains how the anomalous magnetic moments of the electron and the muon are measured, and goes into quite some detail on neutrino experiments. 

The book would benefit from improved editing. For example, the units are sometimes in italics and sometimes not, some equations are tagged twice, some plots lack axis labels, and wavy and curly lines are used inconsistently in the Feynman diagrams. Raby does make good use of references, though, pointing the reader to other textbooks and to the original literature, although the index needs to be extended significantly to be useful.

I recommend this book for advanced undergraduates, graduate students and lecturers. It provides a very useful resource for designing a lecture course on quantum field theory and beyond-the-SM physics, and the amount of material covered is impressive and comprehensive. Beginners might be overwhelmed by Raby’s compact style, so I would recommend that those who are new to quantum field theory read a more accessible textbook in parallel.

CERN Courier’s Higgstory https://cerncourier.com/a/cern-couriers-higgstory/ Fri, 01 Jul 2022 17:26:20 +0000 https://preview-courier.web.cern.ch/?p=102250 The CERN Courier editors take a tour through the magazine's Higgs archives.


It was March 1977 when the hypothetical Higgs boson first made its way onto the pages of this magazine. Reporting on a talk by Steven Weinberg at the Chicago Meeting of the American Physical Society, the editors noted the dramatic success of gauge theories in explaining recent discoveries at the time — beginning with the observation of the neutral current at CERN in 1973 and the “new physics” following the J/ψ discovery at Brookhaven and Stanford the following year, observing: “The theories also postulate a set of scalar particles in a similar mass range… If Higgs bosons exist, they will affect particle behaviour at all energies. However, their postulated interactions are even weaker than the normal weak interactions. The effects would only be observable on a very small scale and would usually be drowned out by the stronger interactions.”

The topic clearly drew the attention of readers, as just a few issues later, in September 1977, the editors delved deeper into the origins of the Higgs boson and its role in spontaneous symmetry breaking, offering Abdus Salam’s “personal picture” to communicate this abstruse concept: “Imagine a banquet where guests sit at round tables. A bird’s eye view of the scene presents total symmetry, with serviettes alternating with people around each table. A person could equally well take a serviette from his right or from his left. The symmetry is spontaneously broken when one guest decides to pick up from his left and everyone else follows suit.”

Within a year, CERN Courier was on the trail of how the Higgs boson might show itself experimentally. Reporting on a “Workshop on Producing High Luminosity Proton–Antiproton Storage Rings” held at Berkeley, the April 1978 issue stated: “As well as the intermediate boson, the proton–antiproton colliders could give the first signs of the Higgs particles or of other unexpected states. While the discovery of weak neutral currents and charm provided impressive evidence for the gauge-theory picture that unifies electromagnetic and weak interactions, one prediction of this picture is the existence of spinless Higgs bosons. If these are not found at higher energies, some re-thinking might be required.” In the December 1978 issue, with apologies to Neil Armstrong, the Courier ran a piece titled “A giant LEP for mankind”. The hope was that with LEP, physicists had the tool to explore in depth the details of the symmetry breaking mechanism at the heart of weak interaction dynamics.

The award of the 1979 Nobel Prize in Physics to Weinberg, Glashow and Salam for the electroweak theory received full coverage in December that year, with the Courier expressing confidence in the Higgs: “Another vital ingredient of the theory which remains to be tested are the Higgs particles of the spontaneous symmetry breaking mechanism. Here the theory is still in a volatile state and no firm predictions are possible. But this mechanism is crucial to the theory, and something has to turn up.”

A Higgs for the masses

To many people, wrote US theorist Sam Treiman in November 1981, the Higgs particle looks somewhat artificial — “a kind of provisional stand-in for deeper effects at a more fundamental level”. Four years later, “with several experiments embarking on fresh Higgs searches”, Richard Dalitz and Louis Lyons organised a neatly titled workshop “Higgs for the masses” to review the theoretical and experimental status. Another oddity of the Higgs, wrote Lyons, is that unless it is very light (less than 10⁻¹⁷ eV), the Higgs should make the universe curved, “contributing more to the cosmological constant than the known limit permits”. Lower limits (from spontaneous symmetry breaking) and higher limits (from the unitarity requirement) open up a wide range of masses for the Higgs to manoeuvre — between 7 and 1000 GeV, he noted. “From time to time, new ‘bumps’ and effects are tentatively put forward as candidate Higgs, but so far none are convincing.”
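
A rough sketch of the vacuum-energy argument behind that startlingly small number (a back-of-the-envelope estimate of ours, not taken from Lyons’s article) runs as follows: the Higgs potential at its minimum contributes a constant energy density that gravitates like a cosmological constant, and keeping it below the observational limit, of order (10⁻³ eV)⁴, forces the Higgs mass to be tiny unless something cancels the contribution.

\[
% vacuum energy of the Higgs potential at its minimum (standard-model conventions)
V_{\min} = -\tfrac{1}{8}\, m_H^2\, v^2 , \qquad v \simeq 246\ \mathrm{GeV},
\]
\[
% demanding it stay below the cosmological-constant limit gives a tiny mass bound
|V_{\min}| \lesssim (10^{-3}\ \mathrm{eV})^4
\;\Rightarrow\;
m_H \lesssim \sqrt{\frac{8\,(10^{-3}\ \mathrm{eV})^4}{v^2}} \sim 10^{-17}\ \mathrm{eV}.
\]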

LEP’s electroweak adventure reached a dramatic climax in the summer of 2000, with hints that a light Higgs boson was showing itself. In October, the machine was granted a stay of Higgs execution. Alas, the signal faded, and the final curtain fell on LEP in November — a “LEPilogue” heralding the beginning of a new era: the LHC.

Discussions about a high-energy hadron collider were ongoing long before: ICFA’s Future Perspectives meeting at Brookhaven in October 1987 noted two major hadron collider projects on the market: “the US Superconducting Supercollider, with collision energies of 40 TeV in an 84 kilometre ring, and the CERN Large Hadron Collider, with up to 17 TeV collision energies”. In December 1994, shortly after CERN turned 40, Council provided the lab with “The ultimate birthday present”: the unanimous approval of the LHC. A quarter of a century later, the LHC started up and brought particle physics to the world.

Together with LEP data, Fermilab’s CDF and DØ experiments and the LHC 2011 measurement campaign narrowed down the possible mass range for the Higgs boson to between 115 and 127 GeV. First tantalising hints of the Higgs boson were presented on 13 December 2011. The quest remained open for another half a year, until Director-General Rolf Heuer, following the famous talks by ATLAS and CMS spokespersons Fabiola Gianotti and Joe Incandela, concluded on 4 July 2012: “As a layman I would say: I think we have it”. It was a day to remember: a breakthrough discovery rooted in decades of work by thousands of individuals that rocked the CERN auditorium and reverberated around the world. A new chapter in particle physics had begun…

To mark the 10th anniversary of this momentous event, from Monday 4 July the Courier will be exploring the theoretical and experimental effort behind the Higgs-boson discovery, the immense progress made by ATLAS and CMS in our understanding of this enigmatic particle, and the deep connections between the Higgs boson and some of the most profound open questions in fundamental physics.

Wherever the Higgs boson leads, CERN Courier will be there to report!

The post CERN Courier’s Higgstory appeared first on CERN Courier.

]]>
Feature The CERN Courier editors take a tour through the magazine's Higgs archives. https://cerncourier.com/wp-content/uploads/2022/07/higgstory_collage.png
Stepping into the spotlight https://cerncourier.com/a/stepping-into-the-spotlight/ Fri, 01 Jul 2022 15:00:13 +0000 https://preview-courier.web.cern.ch/?p=101167 In an excerpt from his new book Elusive: How Peter Higgs Solved the Mystery of Mass, Frank Close recounts the story of the 2013 Nobel Prize in Physics.

The post Stepping into the spotlight appeared first on CERN Courier.

]]>
François Englert and Peter Higgs

With the boson confirmed, speculation inevitably grew about the 2012 Nobel Prize in Physics. The prize is traditionally announced on the Tuesday of the first full week in October, at about midday in Stockholm. As it approaches, a highly selective epidemic breaks out: Nobelitis, a state of nervous tension among scientists who crave Nobel recognition. Some of the larger egos will have previously had their craving satisfied, only perhaps to come down with another fear: will I ever be counted as one with Einstein? Others have only a temporary remission, before suffering a renewed outbreak the following year.

Three people at most can share a Nobel, and at least six had ideas like Higgs’s in the halcyon days of 1964 when this story began. Adding to the conundrum, the discovery of the boson involved teams of thousands of physicists from all around the world, drawn together in a huge cooperative venture at CERN, using a machine that is itself a triumph of engineering. 

The 2012 Nobel Prize in Physics was announced on Tuesday 9 October and went to Serge Haroche and David Wineland for taking the first steps towards a quantum computer. Two days later, I went to Edinburgh to give a colloquium and met Higgs for a coffee beforehand. I asked him how he felt now that the moment had passed, at least for this year. “I’m enjoying the peace and quiet. My phone hasn’t rung for two days,” he remarked. 

That the sensational discovery of 2012 was indeed of Higgs’s boson was, by the summer of 2013, beyond dispute. That Higgs was in line for a Nobel prize also seemed highly likely. Higgs himself, however, knew from experience that in the Stockholm stakes, nothing is guaranteed. 

Back in 1982, at dawn on 5 October in the Midwest and the eastern US, preparations were in hand for champagne celebrations in three departments at two universities. At Cornell, the physics department hoped they would be honouring Kenneth Wilson, while over in the chemistry department their prospect was Michael Fisher. In Chicago, the physicists’ hero was to be Leo Kadanoff. Two years earlier the trio had shared the Wolf Prize, the scientific analogue of the Golden Globes to the Nobel’s Oscars, for their work on critical phenomena connected with phase transitions, fuelling speculation that a Nobel would soon follow. At the appointed hour in Stockholm, the chair of the awards committee announced that the award was to Wilson alone. The hurt was especially keen in the case of Michael Fisher, whose experience and teaching about phase transitions, illuminating the subtle changes in states of matter such as melting ice and the emergence of magnetism, had inspired Wilson, five years his junior. The omission of Kadanoff and Fisher was a sensation at the time and has remained one of the intrigues of Nobel lore.

Fisher’s agony was no secret to Peter Higgs. As undergraduates they had been like brothers and remained close friends for more than 60 years. Indeed, Fisher’s influence was not far away in July 1964, for it was while examining how some ideas from statistical mechanics could be applied to particle physics that Higgs had the insight that would become the capstone to the theory of particles and forces half a century later. For this he was to share the 2004 Wolf Prize with Robert Brout (who sadly died in 2011) and François Englert – just as Fisher, Kadanoff and Wilson had shared this prize in 1980. Then as October approached in 2013 Higgs became a hot favourite at least to share the Nobel Prize in Physics, and the bookmakers would only take bets at extreme odds-on. 

Time to escape 

In 2013, 8 October was the day when the Nobel decision would be announced. Higgs’s experiences the year before had helped him to prepare: “I decided not to be at home when the announcement was made with the press at my door; I was going to be somewhere else.” His first plan was to disappear into the Scottish Highlands by train, but he decided it was too complicated, and that he could hide equally well in Edinburgh. “All I would have to do is go down to Leith early enough. I knew the announcement would be around noon so I would leave home soon after 11, giving myself a safe margin, and have an early lunch in Leith about noon.” 

ATLAS and CMS physicists in Building 40 on 8 October 2013

Richard Kenway, the Tait Professor of Mathematical Physics at Edinburgh and one of the university’s vice principals, confirmed the tale. “That was what we were all told, and he completely convinced us. Right up to the actual moment when we were sitting waiting for the [Nobel] announcement, we thought he had disappeared off somewhere into the Highlands.” Some newspapers got the fake news from the department, and one reporter even went up into the Highlands to look for him.

As scientists and journalists across the world were glued to the live broadcast, the Nobel committee was still struggling to reach the famously reclusive physicist. The announcement of his long-awaited crown was delayed by about half an hour until they decided they could wait no longer. Meanwhile, Peter Higgs sat at his favourite table in The Vintage, a seafood bar in Henderson Street, Leith, drinking a pint of real ale and considering the menu. As the committee announced that it had given the prize to François Englert and Peter Higgs “for the theoretical discovery of a mechanism that contributes to our understanding of the origin of mass of subatomic particles, and which recently was confirmed through the discovery of the predicted fundamental particle, by the ATLAS and CMS experiments at CERN’s Large Hadron Collider”, phones started going off in the Edinburgh physics department. 

Higgs finished his lunch. It seemed a little early to head home, so he decided to look in at an art exhibition. At about three o’clock he was walking along Heriot Row in Edinburgh, heading for his flat nearby, when a car pulled up near the Queen Street Gardens. “A lady in her 60s, the widow of a high-court judge, got out and came across the road in a very excited state to say, ‘My daughter phoned from London to tell me about the award’, and I said, ‘What award?’ I was joking of course, but that’s when she confirmed that I had won the prize. I continued home and managed to get in my front door with no more damage than one photographer lying in wait.” It was only later that afternoon that he finally learned from the radio news that the award was to himself and Englert. 

Suited and booted 

On arrival in Stockholm in December 2013, after a stressful two-day transit in London, Higgs learned that one of the first appointments was to visit the official tailor. The costume was to be formal morning dress in the mid-19th-century style of Alfred Nobel’s time, including elegant shoes adorned with buckles. As Higgs recalled, “Getting into the shirt alone takes considerable skill. It was almost a problem in topology.” The demonstration at the tailor’s was hopeless. Higgs was tense and couldn’t remember the instructions. On the day of the ceremony, fortunately, “I managed somehow.” Then there were the shoes. The first pair were too small, but when he tried bigger ones, they wouldn’t fit comfortably either. He explained, “The problem is that the 19th-century dress shoes do not fit the shape of one’s foot; they were rather pointy.” On the day of the ceremony both physics laureates had a crisis with their shoes. “Englert called my room: ‘I can’t wear these shoes. Can we agree to wear our own?’ So we did. We were due to be the first on the stage and it must have been obvious to everyone in the front row that we were not wearing the formal shoes.” 

Robert Brout in spirit completed a trinity of winners

On the afternoon of 10 December, nearly 2000 guests filled the Stockholm Concert Hall to see 12 laureates receive their awards from King Gustav of Sweden. They had been guided through the choreography of the occasion earlier, but on the day itself, performing before the throng in the hall, there would be first-night nerves for this once-in-a-lifetime theatre. Winners of the physics prize would be called to receive their awards first, while the others watched and could see what to expect when they were named. The scenery, props and supporting cast were already in place. These included former winners dressed in tail suits and proudly wearing the gold button stud that signifies their membership of this unique club. Among them were Carlo Rubbia, discoverer of the W and Z particles, who instigated the experimental quest for the boson and won the prize in 1984; Gerard ’t Hooft, who built on Higgs’s work to complete the theoretical description of the weak nuclear force and won in 1999; and 2004 winner Frank Wilczek, who had built on his own prize-winning work to identify the two main pathways by which the Higgs boson had been discovered.

Peter Higgs in July 2012

After a 10-minute oration by the chair of the Nobel Foundation and a musical interlude, Lars Brink, chairman of the Nobel Committee for Physics, managed to achieve one of the most daunting challenges in science pedagogy, successfully addressing both the general public in the hall and the assembled academics, including laureates from other areas of science. The significance of what we were celebrating was beyond doubt: “With discovery of the Higgs boson in 2012, the Standard Model of physics was complete. It has been proved that nature follows precisely that law that Brout, Englert and Higgs created. This is a fantastic triumph for science,” Brink announced. He also introduced a third name, that of Englert’s collaborator, Robert Brout. In so doing, he made an explicit acknowledgement that Brout in spirit completed a trinity of winners. 

Brink continued with his summary history of how their work and that of others established the Standard Model of particle physics. Seventeen months earlier the experiments at the LHC had confirmed that the boson is real. What had been suspected for decades was now confirmed forever. The final piece in the Standard Model of particle physics had been found. The edifice was robust. Why this particular edifice is the one that forms our material universe is a question for the future. Brink now made the formal invitation for first Englert and then Higgs to step forward to receive their share of the award.

Higgs, resplendent in his formal suit, and comfortable in his own shoes, rose from his seat and prepared to walk to centre-stage. Forty-eight years since he set out on what would be akin to an ascent of Everest, Higgs had effectively conquered the Hillary step – the final challenge before reaching the peak – on 4 July 2012 when the existence of his boson was confirmed. Now, all that remained while he took nine steps to reach the summit was to remember the choreography: stop at the Nobel Foundation insignia on the carpet; shake the king’s hand with your right hand while accepting the Nobel prize and diploma with the other. Then bow three times, first to the king, then to the bust of Alfred Nobel at the rear of the stage, and finally to the audience in the hall.

Higgs successfully completed the choreography and accepted his award. As a fanfare of trumpets sounded, the audience burst into applause. Higgs returned to his seat. The chairman of the chemistry committee took the lectern to introduce the winners of the chemistry prize. To his relief, Higgs was no longer in the spotlight.

All in a name 

The saga of Higgs’s boson had begun with a classic image – a lone genius unlocking the secrets of nature through the power of human thought. The fundamental nature of Higgs’s breakthrough had been immediately clear to him. However, no one, least of all Higgs, could have anticipated that it would take nearly half a century and several false starts to get from his idea to a machine capable of finding the particle. Nor did anyone envision that this single “good idea” would turn a shy and private man into a reluctant celebrity, accosted by strangers in the supermarket. Some even suggested that the reason why the public became so enamoured with Higgs was the solid ordinariness of his name, one syllable long, unpretentious, a symbol of worthy Anglo-Saxon labour. 

Elusive: How Peter Higgs Solved the Mystery of Mass

In 2021, nine years after the discovery, we were reminiscing about the occasion when, to my surprise, Higgs suddenly remarked that it had “ruined my life”. To know nature through mathematics, to see your theory confirmed, to win the plaudits of your peers and join the exclusive club of Nobel laureates: how could all this equate with ruin? To be sure I had not misunderstood, I asked again the next time we spoke. He explained: “My relatively peaceful existence was ending. I don’t enjoy this sort of publicity. My style is to work in isolation, and occasionally have a bright idea.”   

  • This is an edited extract from Elusive: How Peter Higgs Solved the Mystery of Mass, by Frank Close, published on 14 June (Basic Books, US) and 7 July (Allen Lane, UK)

The post Stepping into the spotlight appeared first on CERN Courier.

]]>
Feature In an excerpt from his new book Elusive: How Peter Higgs Solved the Mystery of Mass, Frank Close recounts the story of the 2013 Nobel Prize in Physics. https://cerncourier.com/wp-content/uploads/2022/06/CCJulAug22_Elusive-EnglertHiggs.jpg
Form follows function in QCD https://cerncourier.com/a/form-follows-function-in-qcd/ Wed, 09 Mar 2022 10:59:31 +0000 https://preview-courier.web.cern.ch/?p=97703 Hadron form factors: From Basic Phenomenology to QCD Sum Rules is a valuable reference work, writes Amanda Cooper-Sarkar.

The post Form follows function in QCD appeared first on CERN Courier.

]]>
Hadron form factors

In the 1970s, the study of low-energy (few GeV) hadron–hadron collisions in bubble chambers was all the rage. It seemed that we understood very little. We had the SU(3) of flavour, Regge theory and the S-matrix to describe hadronic processes, but no overarching theory. Of course, theorists were already working on perturbative QCD, and this started to gain traction when experimental results from the Big European Bubble Chamber at CERN showed signs of scaling violations and yielded an early measurement of the QCD scale, ΛQCD. We have been living with the predictions of perturbative QCD ever since, at increasingly high orders. But there have always been non-perturbative inputs, such as the parton distribution functions.
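
For readers meeting it for the first time, ΛQCD is the scale at which the strong coupling becomes large; at one loop (a standard textbook expression, quoted here for orientation rather than taken from the book under review) it enters the running coupling as

\[
% one-loop running of the strong coupling; n_f is the number of active quark
% flavours, and Lambda_QCD marks the scale where the coupling formally blows up
\alpha_s(Q^2) = \frac{12\pi}{(33 - 2 n_f)\,\ln\!\left(Q^2/\Lambda_{\mathrm{QCD}}^2\right)} .
\]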

Hadron Form Factors: From Basic Phenomenology to QCD Sum Rules takes us back to low-energy hadron physics and shows us how much more we know about it today. In particular, it explores the formalism for heavy-flavour decays, which is particularly relevant at a time when it seems that the only anomalies we observe with respect to the Standard Model appear in various B-meson decays. It also explores the connections between space-like and time-like processes in terms of QCD sum rules connecting perturbative and non-perturbative behaviour.

The book takes us back to low-energy hadron physics and shows us how much more we know about it today

The general introduction reminds us of the formalism of form factors in the atomic case. This is generalised to mesons and baryons in chapters 2 and 3, after the introduction of QCD in chapter 1, with an emphasis on quark and gluon electroweak currents and their generalisation to effective currents. Hadron spectroscopy is reviewed from a modern perspective and heavy-quark effective theory is introduced. In chapter 2, the formalism for the pion form factor, which is related to the pion decay constant, is introduced via e-π scattering. Due emphasis is placed on how one may measure these quantities. I also appreciated the explanation of how a pseudoscalar particle such as the pion can decay via the axial vector current – a question often raised by smart undergraduates. (Clue: the axial vector current is not conserved.) Next, the πe3 decay is considered and generalised to K-, D- and B-meson semileptonic decays. Chapter 3 covers the baryon form factors and their decay constants, and chapter 4 considers hadronic radiative transitions. Chapter 5 relates the pion form factor in the space-like region to its counterpart in the time-like region in e+e− → π+π−, where one has to consider resonances and widths. Relationships are developed, whereby one can see that by measuring pion and kaon form factors in e+e− scattering one can predict the widths of decays such as τ → ππν and τ → KKν. In chapter 6, non-local hadronic matrix elements are introduced to extend the formalism to deal with decays such as π → γγ and B → Kμμ.
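
To fix notation for readers who do not have the book to hand, the two quantities on which chapter 2 builds are conventionally defined as follows (standard normalisations; the book’s own conventions may differ slightly):

\[
% pion electromagnetic form factor, from the matrix element of the em current
\langle \pi^+(p')\,|\, j^{\mathrm{em}}_{\mu} \,|\, \pi^+(p) \rangle = (p+p')_{\mu}\, F_{\pi}(q^2),
\qquad q = p'-p,\quad F_{\pi}(0)=1,
\]
\[
% pion decay constant, from the axial-vector current that governs pi -> mu nu
\langle 0\,|\, \bar d\, \gamma_{\mu}\gamma_5\, u \,|\, \pi^+(p) \rangle = i f_{\pi}\, p_{\mu}.
\]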

The book shifts gears in chapters 7–10. Here, QCD is used to calculate hadronic matrix elements. Chapter 7 covers the calculation of the form factors in the infinite momentum frame, whereby the asymptotic form factor can be expressed in terms of the pion decay constant and a pion distribution amplitude describing the momentum distribution between two valence partons in the pion. In chapter 8, the QCD sum rules are introduced. The two-point correlation of quark current operators can be calculated in perturbative QCD at large space-like momenta, and the result is expressed in terms of perturbative contributions and the QCD vacuum condensates. This can then be related through the sum rule to the hadronic degrees of freedom in the time-like region. Such sum rules are used to gain information both on condensate densities and quark masses from accurate hadronic data, and on hadronic decay constants and masses from QCD calculations. The connection is made to parton–hadron duality and to the operator product expansion. Some illustrative examples of the technique, such as the calculation of the strange-quark mass and the pion decay constant, are also given. Chapter 9 concerns the light-cone expansion and light-cone dominance, which are then used to explain the role of light-cone sum rules in chapter 10. The use of these sum rules in calculating hadron form factors is illustrated with the pion form factor and also with the heavy-to-light form factors necessary for B → π, B → K, D → π, D → K and B → D decays.
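
Schematically, the sum-rule machinery of chapter 8 rests on evaluating one and the same two-point correlator in two ways and matching the results. The sketch below follows the standard formulation rather than the book’s exact notation; j(x) stands for a generic quark current, and the Wilson coefficients C_n, condensates ⟨O_n⟩ and threshold s_min are placeholders.

\[
% two-point correlator of a quark current j(x)
\Pi(q^2) = i \int \mathrm{d}^4x \; e^{iq\cdot x}\,
\langle 0\,|\, T\{\, j(x)\, j^{\dagger}(0) \,\}\,|\, 0 \rangle,
\]
\[
% OPE side at large space-like q^2: perturbation theory plus vacuum condensates
\Pi^{\mathrm{OPE}}(q^2) = \Pi^{\mathrm{pert}}(q^2)
 + \frac{C_4\,\langle \mathcal{O}_4 \rangle}{(q^2)^2}
 + \frac{C_6\,\langle \mathcal{O}_6 \rangle}{(q^2)^3} + \dots,
\]
\[
% hadronic side: a dispersion relation over the physical spectral density
\Pi(q^2) = \frac{1}{\pi}\int_{s_{\min}}^{\infty}
\frac{\mathrm{Im}\,\Pi(s)}{s - q^2 - i\epsilon}\,\mathrm{d}s \;+\; \text{subtractions}.
\]

Matching the two representations, usually after a Borel transform, is what ties condensate densities and quark masses to measured hadron masses and decay constants, as the review describes.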

Overall, this book is not an easy read, but there are many useful insights. This is essentially a textbook, and a valuable reference work that belongs in the libraries of particle-physics institutes around the world.

The post Form follows function in QCD appeared first on CERN Courier.

]]>
Review Hadron form factors: From Basic Phenomenology to QCD Sum Rules is a valuable reference work, writes Amanda Cooper-Sarkar. https://cerncourier.com/wp-content/uploads/2022/02/Hadron-from-factors.png
Your Adventures at CERN: Play the Hero Among Particles and a Particular Dinosaur! https://cerncourier.com/a/your-adventures-at-cern-play-the-hero-among-particles-and-a-particular-dinosaur/ Wed, 09 Mar 2022 10:52:27 +0000 https://preview-courier.web.cern.ch/?p=97695 Filled with brain-tickling facts about particles and science wonders, Letizia Diamante’s debut book will engage children of all ages.

The post Your Adventures at CERN: Play the Hero Among Particles and a Particular Dinosaur! appeared first on CERN Courier.

]]>
Your adventures at CERN

Billed as a bizarre adventure filled with brain-tickling facts about particles and science wonders, Your Adventures at CERN invites young audiences to experience a visit to CERN in different guises.

The reader can choose one of three characters, each with a different story: a tourist, a student and a researcher. The stories are intertwined, and the choice of the reader’s actions through the book changes their journey, rather than following a linear chronology. The stories are filled with puzzles, mazes, quizzes and many other games that challenge the reader. Engaging physics references and explanations, as well as the solutions to the quizzes, are given at the back of the book.

Author Letizia Diamante, a biochemist turned science communicator who previously worked in the CERN press office, portrays the CERN experience in an engaging and understandable way. The adventures are illustrated with funny jokes and charismatic characters, such as “Schrödy”, a hungry cat that guides the reader through the adventures in exchange for food. Detailed hand-drawn illustrations by Claudia Flandoli are included, together with photographs of CERN facilities that take the reader directly into the heart of the lab. Moreover, the book includes several historical facts about particle physics and other topics, such as the city of Geneva and the extinct dinosaurs from the Jurassic era, which is named after the nearby Jura mountains on the border between France and Switzerland. A particle-physics glossary and extra information, such as fun cooking recipes, are also included at the end.

Although targeted mainly at children, this book is also suitable for teenagers and adults looking for a soft introduction to high-energy physics and CERN, offering a refreshing addition to the more mainstream popular particle-physics literature.

The post Your Adventures at CERN: Play the Hero Among Particles and a Particular Dinosaur! appeared first on CERN Courier.

]]>
Review Filled with brain-tickling facts about particles and science wonders, Letizia Diamante’s debut book will engage children of all ages. https://cerncourier.com/wp-content/uploads/2022/03/Your-adventures-at-CERN_featured.jpg
Fear of a Black Universe: an outsider’s guide to the future of physics https://cerncourier.com/a/fear-of-a-black-universe-an-outsiders-guide-to-the-future-of-physics/ Wed, 09 Mar 2022 10:51:22 +0000 https://preview-courier.web.cern.ch/?p=97681 Stephon Alexander’s Fear of a Black Universe makes a strong case for deviance in research and academia.

The post Fear of a Black Universe: an outsider’s guide to the future of physics appeared first on CERN Courier.

]]>
Fear of a black universe feature

Stephon Alexander is a professor of theoretical physics at Brown University, specialising in cosmology, particle physics and quantum gravity. He is also a self-professed outsider, as the subtitle of his latest book Fear of a Black Universe suggests. His first book, The Jazz of Physics, was published in 2016. Fear of a Black Universe is a rallying cry for anyone who feels like a misfit because their identity or outside-the-box thinking doesn’t mesh with cultural norms. By interweaving historical anecdotes and personal experiences, Alexander shows how outsiders drive innovation by making connections and asking questions insiders might dismiss as trivial.

Alexander is Black and internalised his outsider sense early in his career. As a postdoc in the early 2000s, he found that his attempts to engage with other postdocs in his group were rebuffed. He eventually learned from his friend Brian Keating, who is white, the reason why: “They feel that they had to work so hard to get to the top and you got in easily, through affirmative action”. Instead of finding his peers’ rejection limiting, Alexander reinterpreted their dismissal as liberating: “I’ve come to realise that when you fit in, you might have to worry about maintaining your place in the proverbial club… so I eventually became comfortable being the outsider. And since I was never an insider, I didn’t have to worry that colleagues might laugh at me for my unlikely approach.”

Instead of finding his peers’ rejection limiting, Alexander reinterpreted their dismissal as liberating

Alexander argues that true breakthroughs come from “deviants”. He draws parallels between outsiders in physics and graffiti artists, who were considered vandals until the art world recognised their talent and contributions. Alexander recounts his own “deviance” in a humorous and sometimes  self-deprecating manner. He recalls a talk he gave at a conference about his first independent paper, which involved reinterpreting the universe as a three-dimensional membrane orbiting a five-dimensional black hole. During the talk he was often interrupted, eventually prompting a well-respected Indian physicist to stand up and shout “Let him finish! No one ever died from theorising.”

Alexander took these words to heart, and asks his readers to do the same during the speculative discussions in the second part of his book. Here, Alexander intersperses mainstream physics with some of his self-described “strange” ideas, acknowledging that some readers might write him off as an “oddball crank”. He explores the intersection of physics with philosophy, biology, consciousness, and searches for extraterrestrial life. Some sections – such as the chapter on alien quantum computers generating the effect of dark energy – feel more like science fiction than science. But Alexander reassures readers that, while many of his ideas are strange, so are many experimentally verified tenets of physics. “In fact, the likelihood that any one of us will create a new paradigm because we have violated the norms… is very slim,” he observes.

Science-wise, this book is not for the faint-hearted. While many other public-facing physics books slowly wade readers into early-20th-century physics and touch on more abstract concepts only in the final chapters, part I of Fear of a Black Universe dives directly into relativity, quantum mechanics and emergence. Part II then launches into a much deeper discussion about supersymmetry, baryogenesis, quantum gravity and quantum computing. But the strength of Alexander’s new work isn’t in its retellings of Einstein’s thought experiments or even its deconstruction of today’s cosmological enigma. More than anything, this book makes a case for cultivating diversity in science that goes beyond “gesticulations of identity politics”.

Fear of a Black Universe is both mind-bending and refreshing. It approaches physics with a childlike curiosity and allows the reader to playfully contemplate questions many have but few discuss for fear of sounding like a crank. This book will be enjoyable for scientists and science enthusiasts who can set cultural norms aside and just enjoy the ride.

The post Fear of a Black Universe: an outsider’s guide to the future of physics appeared first on CERN Courier.

]]>
Opinion Stephon Alexander’s Fear of a Black Universe makes a strong case for deviance in research and academia. https://cerncourier.com/wp-content/uploads/2022/02/Enjoy-the-ride.jpg
Commemorating Bruno Touschek’s centenary https://cerncourier.com/a/commemorating-the-centenary-of-bruno-touschek/ Mon, 07 Feb 2022 16:32:53 +0000 https://preview-courier.web.cern.ch/?p=97266 In December, a memorial symposium was held in Rome to celebrate the life and scientific contributions of Bruno Touschek.

The post Commemorating Bruno Touschek’s centenary appeared first on CERN Courier.

]]>

Bruno Touschek was born in Vienna on 3 February 1921. His mother came from a well-to-do Jewish family and his father was a major in the Austrian Army. Bruno witnessed the tragic consequences of racial discrimination, which prevented him from completing his high-school and university studies in Austria. But he also experienced the hopes of the post-war era and played a role in the post-war reconstruction. With the help of his friends, he continued his studies in Hamburg, where he worked on the 15 MeV German betatron proposed by Rolf Widerøe and learnt about electron accelerators. After the war he obtained his PhD at the University of Glasgow in 1949, where he was involved in theoretical studies and in the building of a 300 MeV electron synchrotron. Touschek emerged from the early post-war years as one of the first physicists in Europe endowed with a unique expertise in the theory and functioning of accelerators. His genius was nurtured by close exchanges with Arnold Sommerfeld, Werner Heisenberg, Max Born and Wolfgang Pauli, among others, and flourished in Italy, where he arrived in 1953 at the invitation of Edoardo Amaldi, his first biographer and the first Secretary-General of CERN.

In 1960 he proposed and built the first electron-positron storage ring, Anello di Accumulazione (AdA), which started operating in Frascati in February 1961. The following year, in order to improve the injection efficiency, a Franco-Italian collaboration was formed that brought AdA to Orsay. It was here that the “Touschek effect”, whereby stored particles are lost through scattering within the beam, was discovered, and that proof of collisions in an electron-positron ring was obtained.

AdA paved the way to the electron-positron colliders ADONE in Italy, ACO in France, VEPP-2 in the USSR and SPEAR in the US. Bruno spent the last year of his life at CERN, from where – already quite ill – he was brought to Innsbruck, Austria, where he passed away on 25 May 1978 aged just 57.

Bruno Touschek’s life and scientific contributions were celebrated at a memorial symposium from 2 to 4 December, held at the three institutions where Touschek left a lasting legacy: Sapienza University of Rome, INFN Frascati National Laboratories and Accademia Nazionale dei Lincei. Contributions also came from the Irène Joliot-Curie laboratory, with sponsorship from the Austrian Embassy in Italy.

In addition to Touschek’s impact on the physics of particle colliders, the three-day symposium addressed the present-day landscape. Carlo Rubbia and Ugo Amaldi gave a comprehensive overview of the past and future of particle colliders, followed by talks about physics at ADONE and LEP, and future machines, such as a muon collider, the proposed Future Circular Collider at CERN and the Circular Electron Positron Collider in China, as well as new developments in accelerator techniques. ADONE’s construction challenges were remembered. Developments in particle physics since the 1960s – including the quark model, dual models and string theory, spontaneous symmetry breaking and statistical physics – were described in testimonies from Rome, Frascati, Nordita and the Collège de France.

Touschek’s direct influence was captured in talks by his former students, from Rome and the Frascati theory group, which he founded in the mid-1960s. His famous lectures on statistical mechanics, given from 1959 to 1960, were remembered by many speakers. Giorgio Parisi, who graduated with Nicola Cabibbo, recollected the years in Frascati after the observation at ADONE of a large hadron multiplicity in e+e− annihilations, and the ideas leading to QCD.

The final day of the symposium, which took place at the Accademia dei Lincei where Touschek had been a foreign member since 1972, turned to future strategies in high-energy physics, including neutrinos and other messengers from the universe. Also prominent were the many benefits brought to society by particle accelerators, reaffirming the intrinsic broader value of fundamental research.

Touschek’s life and scientific accomplishments were graphically illustrated at the three locations of the symposium, including displays of his famous drawings on academic life in Rome and Frascati. LNF’s visitor centre was dedicated to Touschek, in the presence of his son Francis Touschek.

The post Commemorating Bruno Touschek’s centenary appeared first on CERN Courier.

]]>
Meeting report In December, a memorial symposium was held in Rome to celebrate the life and scientific contributions of Bruno Touschek. https://cerncourier.com/wp-content/uploads/2022/02/touschek_feat_img.png
One day in September: Copenhagen https://cerncourier.com/a/one-day-in-september-copenhagen/ Tue, 21 Dec 2021 12:14:20 +0000 https://preview-courier.web.cern.ch/?p=96741 A new production of Michael Frayn’s masterwork Copenhagen contains little action but much physics and food for thought, finds Letizia Diamante.

The post One day in September: Copenhagen appeared first on CERN Courier.

]]>
The ghosts of Niels Bohr, Werner Heisenberg and Margrethe Bohr

“But why?” asks Margrethe Bohr. Her husband, Niels, replies “Does it matter my love now that we’re all three of us dead and gone?” Alongside Werner Heisenberg, the trio look like spirits meeting in an atemporal dimension, maybe the afterlife, under an eerie ring of light. Dominating an almost empty stage, they try to revive what happened on one day in September 1941, when Heisenberg, a prominent figure in Hitler’s Uranverein (Uranium Club), travelled to Nazi-occupied Denmark to visit his former mentor, Niels Bohr. 

Why did Heisenberg go to meet Bohr that day? Did he seek an agreement not to develop the bomb in Germany? Was he searching for intelligence on Allied progress? To convince Bohr that there was no German programme? Or to pick Bohr’s brain on atomic physics? Or, according to Margrethe, to show off? Perhaps his motives were a superposition of all of these. No one knows what was said. This puzzle has intrigued historians ever since. 

Eighty years after that meeting, and 23 since Michael Frayn’s masterwork Copenhagen premiered at the National Theatre in London, award-winning director Polly Findlay and Emma Howlett in her professional directorial debut have revived a play that contains little action but much physics and food for thought.

The three actors orbit like electrons in an atom

Frayn’s nonlinear script is based on three possible versions of the same meeting in Copenhagen in 1941, which can be construed as three different scenarios playing out in the many-worlds interpretation of quantum mechanics. He describes it as the process of rewriting a draft of a paper again and again, trying to unlock more secrets. In the afterlife, the trio’s dialogue jumps back and forth in time, adding confusing memories and contradicting hypotheses. Delivered at pace, the narrative explores historical information and their personal stories.

The three characters reflect on how German scientists failed to build the bomb, even though they had the best start; Otto Hahn, Lise Meitner and Fritz Strassmann having discovered nuclear fission in 1939. But Frayn highlights how Hitler’s Deutsche Physik was hostile to so-called Jewish physics and key Jewish physicists, including Bohr, who later fled to Los Alamos in the US. Frayn’s Heisenberg reveals the disbelief he felt when he learnt about the destruction of Hiroshima on the radio. At the time he was detained in Farm Hall, not far from this theatre in Cambridge in the UK, together with other members of the Uranium Club. Called Operation Epsilon, the bugged hall was used by the Allied forces to try to uncover the state of Nazi scientific progress.

The three actors orbit like electrons in an atom, while the theatre’s revolving stage itself spins. Superb acting by Philip Arditti and Malcolm Sinclair elucidates an extraordinary student–mentor relationship between Heisenberg and Bohr. The sceptical Mrs Bohr (Haydn Gwynne) steers the conversation and questions their friendship, cajoling Bohr to speak in plain language. Nevertheless, the use of scientific jargon could leave some non-experts in the audience behind. 

Although Heisenberg wrote in his autobiography that “it would be better to stop disturbing the spirits of the past,” the private conversation between the two physicists has stirred the interest of the public, journalists and historians for years. In 1956 the journalist Robert Jungk wrote in his debated book, Brighter than a Thousand Suns, that Heisenberg wanted to prevent the development of an atomic bomb. This book was also an inspiration for Frayn’s play. More recently, in 2001, Bohr’s family released some letters that Bohr wrote and never sent to Heisenberg. According to these letters, Bohr was convinced that Heisenberg was building the bomb in Germany.

To this day, the reason for Heisenberg’s visit to Copenhagen remains uncertain, or unknowable, like the properties of a quantum particle that’s not observed. The audience can only imagine what really happened, while considering all philosophical interpretations of the fragility of the human species. 

The post One day in September: Copenhagen appeared first on CERN Courier.

]]>
Review A new production of Michael Frayn’s masterwork Copenhagen contains little action but much physics and food for thought, finds Letizia Diamante. https://cerncourier.com/wp-content/uploads/2021/12/CCJanFeb22_REV_copenhagen.jpg
Hadron colliders in perspective https://cerncourier.com/a/hadron-colliders-in-perspective/ Wed, 15 Dec 2021 11:04:36 +0000 https://preview-courier.web.cern.ch/?p=96449 CERN's celebration of 50 years of hadron colliders in October offered a feast of physics and history.

The post Hadron colliders in perspective appeared first on CERN Courier.

]]>
From visionary engineer Rolf Widerøe’s 1943 patent for colliding beams, to the high-luminosity LHC and its possible successor, the 14 October symposium “50 Years of Hadron Colliders at CERN” offered a feast of physics and history to mark the 50th anniversary of the Intersecting Storage Rings (ISR). Negotiating the ISR’s steep learning curve in the 1970s, the ingenious conversion of the Super Proton Synchrotron (SPS) into a proton–antiproton collider (SppS) in the 1980s, and the dramatic approval and switch-on of the LHC in the 1990s and 2000s chart a scientific and technological adventure story, told by its central characters in CERN’s main auditorium.

Former CERN Director-General (DG) Chris Llewellyn Smith swiftly did away with notions that the ISR was built without a physics goal. Viki Weisskopf (DG at the time) was well aware of the quark model, he said, and urged that the ISR be built to discover quarks. “The basic structure of high-energy collisions was discovered at the ISR, but you don’t get credit for it because it is so obvious now,” said Llewellyn Smith. Summarising the ISR physics programme, Ugo Amaldi, former DELPHI spokesperson and a pioneer of accelerators for hadron therapy, listed the observation of charmed-hadron production in hadronic interactions, studies of the Drell–Yan process, and measurements of the proton structure function as ISR highlights. He also recalled the frustration at CERN in late 1974 when the J/ψ meson was discovered at Brookhaven and SLAC, remarking that history would have changed dramatically had the ISR detectors also enabled coverage at high transverse momentum.

A beautiful machine

Amaldi sketched the ISR’s story in three chapters: a brilliant start followed by a somewhat difficult time, then a very active and interesting programme. Former CERN director for accelerators and technology Steve Myers offered a first-hand account, packed with original hand-drawn plots, of the battles faced and the huge amount learned in getting the first hadron collider up and running. “The ISR was a beautiful machine for accelerator physics, but sadly is forgotten in particle physics,” he said. “One of the reasons is that we didn’t have beam diagnostics, on account of the beam being a coasting beam rather than a bunched beam, which made it really hard to control things during physics operation.” Stochastic cooling, a “huge surprise”, was the ISR’s most important legacy, he said, paving the way for the SppS and beyond.

Former LHC project director Lyn Evans took the baton, describing how the confluence of electroweak theory, the SPS as collider and stochastic cooling led to rapid progress. It started with the Initial Cooling Experiment in 1977–1978, then the Antiproton Accumulator. It would take about 20 hours to produce a bunch dense enough for injection into the SppS, recalled Evans, and several other tricks to battle past the “26 GeV transition, where lots of horrible things” happened. At 04:15 on 10 July 1981, with just him and Carlo Rubbia in the control room, first collisions at 270 GeV at the SppS were declared.

Poignantly, Evans ended his presentation “The SPS and LHC machines” there. “The LHC speaks for itself really,” he said. “It is a fantastic machine. The road to it has been a long and very bumpy one. It took 18 years before the approval of the LHC and the discovery of the Higgs. But we got there in the end.”

Discovery machines

The parallel world of hadron-collider experiments was brought to life by Felicitas Pauss, former CERN head of international relations, who recounted her time as a member of the UA1 collaboration at the SppS during the thrilling period of the W and Z discoveries. Jumping to the present day, early-career researchers from the ALICE, ATLAS, CMS and LHCb collaborations brought participants up to date with the progress at the LHC in testing the Standard Model and the rich physics prospects at Run 3 and the HL-LHC.

Few presentations at the symposium did not mention Carlo Rubbia, who instigated the conversion of the SPS into a hadron collider and was the prime mover of the LHC, particularly, noted Evans, during the period when the US Superconducting Super Collider was under construction. His opening talk presented a commanding overview of colliders, their many associated Nobel prizes and their applications in wider society.

During a brief Q&A at the end of his talk, Rubbia reiterated his support for a muon collider operating as a Higgs factory in the LHC tunnel: “The amount of construction is small, the resources are reasonable, and in my view it is the next thing we should do, as quickly as possible, in order to make sure that the Higgs is really what we think it is.”

It seems in hindsight that the LHC was inevitable, but it was anything but

Christopher Llewellyn Smith

In a lively and candid presentation about how the LHC got approved, Llewellyn Smith also addressed the question of the next collider, noting it will require the unanimous support of the global particle-physics community, a “reasonable” budget envelope and public support. “It seems in hindsight that the LHC was inevitable, but it was anything but,” he said. “I think going to the highest energy is the right way forward for CERN, but no government is going to fund a mega project to reduce error bars – we need to define the physics case.”

Following a whirlwind “view from the US”, in which Young-Kee Kim of the University of Chicago described the Tevatron and RHIC programmes and collated congratulatory messages from the US Department of Energy and others, CERN DG Fabiola Gianotti rounded off proceedings with a look at the future of the LHC and beyond. She updated participants on the significant upgrade work taking place for the HL-LHC and on the status of the Future Circular Collider feasibility study, a high-priority recommendation of the 2020 update of the European strategy for particle physics which is due to be completed in 2025. “The extraordinary success of the LHC is the result of the vision, creativity and perseverance of the worldwide high-energy physics community and more than 30 years of hard work,” the DG stated. “Such a success demonstrates the strength of the community and it’s a necessary milestone for future, even more ambitious, projects.”

Videos from the one-off symposium, capturing the rich interactions between the people who made hadron colliders a reality, are available here.

The post Hadron colliders in perspective appeared first on CERN Courier.

]]>
Meeting report CERN's celebration of 50 years of hadron colliders in October offered a feast of physics and history. https://cerncourier.com/wp-content/uploads/2021/12/50-years-of-hadron-colliders-at-CERN-featured.jpg
The inexplicable neutrino https://cerncourier.com/a/the-inexplicable-neutrino/ Wed, 06 Oct 2021 08:15:01 +0000 https://preview-courier.web.cern.ch/?p=95388 Ghost Particle captures the human spirit surrounding the birth of a modern particle-physics detector.

The post The inexplicable neutrino appeared first on CERN Courier.

]]>
Claustrophobia. South Dakota. A clattering elevator lowers a crew of hard-hat-clad physicists 1500 metres below the ground. 750,000 tonnes of rock are about to be excavated from this former gold mine at the Sanford Underground Research Facility (SURF) to accommodate the liquid-argon time projection chambers (TPCs) of the international Deep Underground Neutrino Experiment (DUNE). Towards the end of the decade, DUNE will track neutrinos that originate 1300 km away at Fermilab in Chicago, addressing leptonic CP violation as well as an ambitious research programme in astrophysics.

Having set the scene, director Geneva Guerin, co-founder of Canadian production company Cinécoop, cuts to a wide expanse: a climber scaling a rock face near the French–Swiss border. Francesca Stocker, the star of the film and then a PhD student at the University of Bern, narrates, relating the scientific method to rock climbing. Stocker and her fellow protagonists are engaging, and the film vividly captures the human spirit surrounding the birth of a modern particle-physics detector.

I don’t think it is possible to explain a neutrino for a general audience

Geneva Guerin

But the viewer is not allowed to settle for long in any one location. After zipping to CERN, and a tour through its corridors accompanied by eerie cello music, we meet Stocker in her home kitchen, explaining how she got interested in science as a child. Next, we hop to Federico Sánchez, spokesperson of the T2K experiment in Japan, explaining the basics of the Standard Model. 

Ghost Particle

T2K, and its successor Hyper-Kamiokande, DUNE’s equal in ambition and scope, both feature in the one-hour-long film. But the focus is on the development of the prototype DUNE detector modules that have been designed, built and tested at the CERN Neutrino Platform – and here the film is at its best. Guerin had full access to protoDUNE activities, allowing her to immerse the viewer with the peculiar but oddly fitting accompaniment of a solo didgeridoo inside the protoDUNE cryostat. We gatecrash celebrations when the vessel was filled with liquid argon and the first test-beam tracks were recorded. The film focuses on detailed descriptions of the workings of TPCs and other parts of the apparatus rather than accessible explanations of the neutrino’s fascinating and mysterious nature. Unformatted plots and graphics are pulled from various sources. While authentic, this gives the film an unpolished, home-made feel.

Given the density of the exposition in some parts, beyond the most enthusiastic popular-science fans, Ghost Particle seems best tailored for physics students encountering experimental neutrino physics for the first time – a point that Guerin herself made during a live Q&A following the CineGlobe screening: “I was aiming at people like me – those who love science documentaries,” she told the capacity crowd. “Originally I envisaged a three-part series over a decade or more, but I realised that I don’t think it is possible to explain a neutrino for a general audience, so maybe it’s something for educational purposes, to help future generations get introduced to this exciting programme.”

The film ends as it began, with the rickety elevator continuing its 12-minute descent into the bowels of the Earth.

The post The inexplicable neutrino appeared first on CERN Courier.

]]>
Review Ghost Particle captures the human spirit surrounding the birth of a modern particle-physics detector. https://cerncourier.com/wp-content/uploads/2021/10/GP_3.png
Quantum gravity in the Vatican https://cerncourier.com/a/quantum-gravity-in-the-vatican/ Tue, 31 Aug 2021 21:40:13 +0000 https://preview-courier.web.cern.ch/?p=94066 Residents of the Vatican Observatory describe life as a full-time physicist in the church.

The post Quantum gravity in the Vatican appeared first on CERN Courier.

]]>
Gabriele Gionti with Pope Francis

“Our job is to be part of the scientific community and show that there can be religious people and priests who are scientists,” says Gabriele Gionti, a Roman Catholic priest and theoretical physicist specialising in quantum gravity who is resident at the Vatican Observatory.

“Our mission is to do good science,” agrees Guy Consolmagno, a noted planetary scientist, Jesuit brother and the observatory’s director. “I like to say we are missionaries of science to the believers.”

Not only missionaries of faith, then, but also of science. And there are advantages.

“At the Vatican Observatory, we don’t have to write proposals, we don’t have to worry about tenure and we don’t have to have results in three years to get our money renewed,” says Consolmagno, who is directly appointed by the Pope. “It changes the nature of the research that is available to us.”

“Here I have had time to just study,” says Gionti, who explains that he was able to extend his research to string theory as a result of this extra freedom. “If you are a postdoc or under tenure, you don’t have this opportunity.”

“I remember telling a friend of mine that I don’t have to write grant proposals, and he said, ‘how do I get in on this?’” jokes Consolmagno, a native of Detroit. “I said that he needed to take a vow of celibacy. He replied, ‘it’s worth it!’.”

Cannonball moment

Clad in T-shirts, Gionti and Consolmagno don’t resemble the priests and monks seen in movies. They are connected to monastic tradition, but do not withdraw from the world. As well as being full-time physicists, both are members of the Society of Jesus – a religious order that traces its origin to 1521, when Saint Ignatius of Loyola was struck in the leg by a cannonball at the Battle of Pamplona. Today they help staff an institution that was founded in 1891, though its origins arguably date back to attempts to fix the date for Easter in 1582.

“It was at the end of the 19th century that the myth began that the church was anti-science, and they would use Galileo as the excuse,” says Consolmagno, explaining that the Pope at the time, Pope Leo XIII, wanted to demonstrate that faith and science were fully compatible. “The first thing that the Vatican Observatory did was to take part in the Carte du Ciel programme,” he says, hinting at a secondary motivation. “Every national observatory was given a region of the sky. Italy was given one region and the Vatican was given another. So, de facto, the Vatican became seen as an independent nation state.”

Guy Consolmagno poses with a summer student

The observatory quickly established itself as a respected scientific organisation. Though it is staffed by priests and brothers, there is an absolute rule that science comes first, says Consolmagno, and the stereotypical work of a priest or monk is actually a temptation to be resisted. “Day-to-day life as a scientist can be tedious, and it can be a long time until you see a reward, but pastoral life can be rewarding immediately,” he explains.

Consolmagno was a planetary scientist for 20 years before becoming a Jesuit. By contrast, Gionti, who hails from Capua in Italy, joined after his first postdoc at UC Irvine in California. Neither reports encountering professional prejudice as a result of their vocation. “I think that’s a generational thing,” says Consolmagno. “Scientists working in the 1970s and 1980s were more likely to be anti-religious, but nowadays it’s not the case. You are looked on as part of the multicultural nature of the field.”

And besides, antagonism between science and religion is largely based on a false dichotomy, says Consolmagno. “The God that many atheists don’t believe in is a God that we also don’t believe in.”

The observatory’s director pushes back hard on the idea that faith is incompatible with physics. “It doesn’t tell me what science to do. It doesn’t tell me what the questions and answers are going to be. It gives me faith that I can understand the universe using reason and logic.” 

Surprised by CERN

Due to light pollution in Castel Gandolfo, a new outpost of the Vatican Observatory was established in Tucson, Arizona, in 1980. A little later in the day, when the Sun was rising there, I spoke to Paul Gabor – an astrophysicist, Jesuit priest and deputy director for the Tucson observatory. Born in Košice, Slovakia, Gabor was a summer student at CERN in 1992, working on the development of the electromagnetic calorimeter of the ATLAS experiment, a project he later continued in Grenoble, thanks to winning a scholarship at the university. “We were making prototypes and models and software. We tested the actual physical models in a couple of test-beam runs – that was fun,” he recalls.

Gabor was surprised at how he found the laboratory. “It was an important part of my journey, because I was quite surprised that I found CERN to be full of extremely nice people. I was expecting everyone to be driven, ambitious, competitive and not necessarily collaborative, but people were very open,” he says. “It was a really good human experience for me.”

“When I finally caved in and joined the Jesuit order in 1995, I always thought, well, these scientists definitely are a group that I got to know and love, and I would like to, in one way or another, be a minister to them and be involved with them in some way.”

“Something that I came to realise, in a beginning, burgeoning kind of way at CERN, is the idea of science being a spiritual journey. It forms your personality and your soul in a way that any sustained effort does.”

Scientific athletes

“Experimental science can be a journey to wisdom,” says Gabor. “We are subject to constant frustration, failure and errors. We are confronted with our limitations. This is something that scientists have in common with athletes, for example. These long labours tend to make us grow as human beings. I think this point is quite important. In a way it explains my experience at CERN as a place full of nice, generous people.”

Surprisingly, however, despite being happy with life as a scientific religious and religious scientist, Gabor is not recruiting.

“There is a certain tendency to abandon science to join the priesthood or religious life,” he says. “This is not necessarily the best thing to do, so I urge a little bit of restraint. Religious zeal is a great thing, but if you are in the third year of a doctorate, don’t just pack up your bags and join a seminary. That is not a very prudent thing to do. That is to nobody’s benefit. This is a scenario that is all too common unfortunately.”

Consolmagno also offers words of caution. “50% of Jesuits leave the order,” he notes. “But this is a sign of success. You need to be where you belong.”

But Gionti, Consolmagno and Gabor all agree that, if properly discerned, the life of a scientific religious is a rewarding one in a community like the Vatican Observatory. They describe a close-knit group with a common purpose and little superficiality.

“Faith gives us the belief that the universe is good and worth studying,” says Consolmagno. “If you believe that the universe is good, then you are justified in spending your life studying things like quarks, even if it is not useful. Believing in God gives you a reason to study science for the sake of science.”

The post Quantum gravity in the Vatican appeared first on CERN Courier.

]]>
Careers Residents of the Vatican Observatory describe life as a full-time physicist in the church. https://cerncourier.com/wp-content/uploads/2021/08/CCSepOct21_CAREERS_pope_feature.jpg
What if scientists ruled the world? https://cerncourier.com/a/what-if-scientists-ruled-the-world/ Thu, 15 Jul 2021 19:55:59 +0000 https://preview-courier.web.cern.ch/?p=93325 Interactive theatre performance What if scientists ruled the world? provided a welcome opportunity for scientists to reflect on how best to communicate their research.

The post What if scientists ruled the world? appeared first on CERN Courier.

]]>
A chemistry professor invents a novel way to produce chemical compounds, albeit with a small chance of toxicity. A paper is published. A quick chat with a science communicator leads to a hasty press release. But when the media picks up on it, the story is twisted.

“What if scientists ruled the world?” — a somewhat sensational but thought-provoking title for a play — is an interactive theatre production by the Australian Academy of Science in partnership with Falling Walls Engage. Staged on 8 May at the Shine Dome in Canberra, Australia, the hybrid performance explored the ramifications of an ill-considered press release, and provided a welcome opportunity for scientists to reflect on how best to communicate their research. The dynamic exchange of ideas between science experts and laypeople in the audience highlighted the power of words, and how they are used to inform, persuade, deceive or confuse.

What if scientists ruled the world?

After setting the scene, director Ali Clinch invited people participating remotely on Zoom and via a YouTube livestream to guide the actors’ actions, helping to advance and reframe the storyline with their ideas, questions and comments. Looking at the same story from different points of view invited the audience to think about the different stakeholders and their responsibility in communicating science. In the first part of the performance, for example, the science communicator talks excitedly about her job with students, but later has to face a crisis that the busy professor is unable or unwilling to deal with. At a critical point in the story, when a town-hall meeting was held to debate the future of a company that employs most of the people in the town but has probably produced the same toxic chemical, everybody felt part of the performance. The audience could even take the place of an actor, or act in a new role.

The play highlighted the pleasures and tribulations of work at the interface between research and public engagement during euphoric discoveries and crisis moments alike, and has parallels both with the confusion encountered during the early stages of the COVID-19 pandemic and misguided early fears that the LHC could generate a black hole. In an age of fake news, sensationalism and misinformation, the performance adeptly highlighted the complexities and vested interests inherent in science communication today.

The post What if scientists ruled the world? appeared first on CERN Courier.

]]>
Review Interactive theatre performance What if scientists ruled the world? provided a welcome opportunity for scientists to reflect on how best to communicate their research. https://cerncourier.com/wp-content/uploads/2021/07/What-if-scientists-1000.jpg
A relational take on quantum mechanics https://cerncourier.com/a/a-relational-take-on-quantum-mechanics/ Wed, 14 Jul 2021 07:31:46 +0000 https://preview-courier.web.cern.ch/?p=92971 Carlo Rovelli’s Helgoland is a well-written and easy-to-follow exploration of quantum mechanics and its interpretation.

The post A relational take on quantum mechanics appeared first on CERN Courier.

]]>
Helgoland

It is often said that “nobody understands quantum mechanics” – a phrase usually attributed to Richard Feynman. This statement may, however, be misleading to the uninitiated. There is certainly a high level of understanding of quantum mechanics. The point, rather, is that there is more than one way to understand the theory, and each of these ways requires us to make some disturbing concessions.

Carlo Rovelli’s Helgoland is therefore a welcome popular book – a well-written and easy-to-follow exploration of quantum mechanics and its interpretation. Rovelli is a theorist working mainly on quantum gravity and foundational aspects of physics. He is also a very successful popular author, distinguished by his erudition and his ability to illuminate the bigger picture. His latest book is no exception.

Helgoland is a barren German island in the North Sea where Heisenberg co-invented quantum mechanics in 1925 while on vacation. The extraordinary sequence of events between 1925 and 1926, when Heisenberg, Jordan, Born, Pauli, Dirac and Schrödinger formulated quantum mechanics, is the topic of the opening chapter of the book.

Helgoland cover

Rovelli only devotes a short chapter to discuss interpretations in general. This is certainly understandable, since the author’s main target is to discuss his own brainchild: relational quantum mechanics. This approach, however, does not do justice to popular ideas among experts, such as the many-worlds interpretation. The reader may be surprised not to find anything about the Copenhagen (or, more appropriately, Bohr’s) interpretation. This is for very good reason, however, since it is not generally considered to be a coherent interpretation. Having mostly historical significance, it has served as inspiration to approaches that keep the spirit of Bohr’s ideas, like consistent histories (not mentioned in the book at all), or Rovelli’s relational quantum mechanics.

Relational quantum mechanics was introduced by Rovelli in an original technical article in 1996 (Int. J. Theor. Phys. 35 1637). Helgoland presents a simplified version of these ideas, explained in more detail in Rovelli’s article, and in a way suitable for a more general audience. The original article, however, can serve as very nice complementary reading for those with some physics background. Relational quantum mechanics claims to be compatible with several of Bohr’s ideas. In some ways it goes back to the original ideas of Heisenberg by formulating the theory without a reference to a wavefunction. The properties of a system are defined only when the system interacts with another system. There is no distinction between observer and observed system. Rovelli meticulously embeds these ideas in a more general historical and philosophical context, which he presents in a captivating manner. He even speculates whether this way of thinking can help us understand topics that, in his opinion, are unrelated to quantum mechanics, such as consciousness.

Helgoland’s potential audience is very diverse, and the book manages to transcend the fact that it is written for the general public. Professionals from both the sciences and the humanities will certainly learn something, especially if they are not acquainted with the nuances of the interpretations of modern physics. The book, however, as is explicitly stated by Rovelli, takes a partisan stance, aiming to promote relational quantum mechanics. As such, it may give a somewhat skewed view of the topic. In that respect, it would be a good idea to read it alongside books with different perspectives, such as Sean Carroll’s Something Deeply Hidden (2019) and Adam Becker’s What is Real? (2018).

The post A relational take on quantum mechanics appeared first on CERN Courier.

]]>
Review Carlo Rovelli’s Helgoland is a well-written and easy-to-follow exploration of quantum mechanics and its interpretation. https://cerncourier.com/wp-content/uploads/2021/06/CCJulAug21_REV_Helgoland.jpg
Poland marks 30 years at CERN https://cerncourier.com/a/poland-marks-30-years-at-cern/ Tue, 29 Jun 2021 15:27:23 +0000 https://preview-courier.web.cern.ch/?p=93067 Three decades since the Polish flag was hoisted at the entrance to CERN, Tadeusz Lesiak recollects the genesis of Poland’s membership and reflects on its impact.

The post Poland marks 30 years at CERN appeared first on CERN Courier.

]]>
Rising up

When CERN was established in the 1950s, with the aim of bringing European countries together to collaborate in scientific research after the Second World War, countries from Eastern and Western Europe were invited to join. At the time, the only eastern country to take up the call was Yugoslavia. Poland’s accession to CERN membership in 1991 was therefore a particularly significant moment in the organisation’s history because Poland was the first country from behind the former Iron Curtain to join CERN. Its example was soon followed by a range of Eastern European countries throughout the 1990s.

At the origin of Polish participation at CERN was the vision of three world-class physicists: Marian Danysz and Jerzy Pniewski from Warsaw, and Marian Mięsowicz from Kraków, who made their first contacts with CERN in the early 1960s. The major domains of Polish expertise around that time encompassed the analysis of bubble-chamber data (especially those related to high-multiplicity interactions), the properties of strange hadrons, charm production, and the construction of gaseous detectors.

In 1963, Poland gained observer status at the CERN Council — the first country from Eastern Europe to do so. During the subsequent 25 years, almost out of nothing, a critical mass of national scientific groups collaborating with CERN on an everyday basis was established. By the late 1980s, the CERN community recognised that Poles deserved full access to CERN. With the feedback and support of their numerous brilliant pupils, Danysz, Pniewski and Mięsowicz had accomplished a goal which had seemed impossible. Today, Poland’s red and white flag graces the membership rosters of all four major Large Hadron Collider (LHC) experiments and beyond.

Poland30 Wired in

Entering the fray
Poland joined CERN two years after the start-up of the Large Electron–Positron Collider (LEP), the forerunner to the LHC. Having already made strong contributions to the construction of LEP’s DELPHI experiment, in particular its silicon vertex detector, electromagnetic calorimeter and RICH detectors, Polish researchers quickly became involved in DELPHI data analyses, including studies of the properties of beauty baryons and searches for supersymmetric particles.

With the advent of the LHC era, Poles became members of all four major LHC-experiment collaborations. In ALICE we are proud of our broad contribution to the study of the quark–gluon plasma using HBT interferometry and electromagnetic probes, and of our participation in the design of and software development for the ALICE time projection chamber. Polish contributions to the ATLAS collaboration encompass not only numerous software and hardware activities (the latter concerning the inner detector and trigger), but also data analyses, notably searching for new physics in the Higgs sector, studies of soft and elastic hadron interactions and a central role in the heavy-ion programme. Involvement in CMS has revolved around the experiment’s muon-detection system, studies of Higgs-boson production and its decays to tau leptons, W+W interactions and searches for exotic, in particular long-lived, particles. This activity is also complemented by software development and coordination of physics analysis for the TOTEM experiment. Last but not least, Polish groups in LHCb have taken important hardware responsibilities for various subdetectors (including the VELO, RICH and high-level trigger) together with studies of b → s transitions, measurements of the angle γ of the CKM matrix and searches for CPT violation, to name but a few.

Beyond colliders
The scope of our research at CERN was never limited to LEP and the LHC. In particular, Polish researchers comprise almost one third of collaborators on the fixed-target experiment NA61/SHINE, where they are involved across the experiment’s strong-interactions programme. Indeed, since the late 1970s, Poles have actively participated in the whole series of deep-inelastic scattering experiments at CERN: EMC, NMC, SMC, COMPASS and recently AMBER. Devoted to studies of different aspects of the partonic structure of the nucleon, these experiments have resulted in spectacular discoveries, including the EMC effect, nuclear shadowing, the proton “spin puzzle”, and 3D imaging of the nucleon.

Poland30 Hands on

Polish researchers have also contributed with great success to studies at CERN’s ISOLDE facility. One of the most important achievements was to establish the coexistence of various nuclear shapes, including octupoles, at low excitation energy in radon, radium and mercury nuclei, using the Coulomb-excitation technique. Polish involvement in CERN neutrino experiments started with the BEBC bubble chamber, followed by the CERN Dortmund Heidelberg Saclay Warsaw (CDHSW​) experiment and, more recently, participation in the ICARUS experiment and the T2K near-detector as part of the CERN Neutrino Platform. In parallel, we take part in preparations for future CERN projects, including the proposed Future Circular Collider and Compact Linear Collider. In terms of theoretical research, Polish researchers are renowned for the phenomenological description of strong interactions and also play a crucial role in the elaboration of Monte Carlo software packages. In computing generally, Poland was the regional leader in implementing the grid computing platform.

The past three decades have brought a few-fold increase in the population of Polish engineers and technicians involved in accelerator science. Experts contributed significantly to the construction of the LHC, and subsequently to services such as electrical quality assurance of the LHC’s superconducting circuits during consecutive long shutdowns. Detector R&D is also a strong activity of Polish engineers and technicians, for example via membership of CERN’s RD51 collaboration which exists to advance the development and application of micropattern gas detectors. These activities take place in close cooperation with national industry, concentrated around cryogenic applications. Growing local expertise in accelerator science also saw the establishment of Poland’s first hadron-therapy centre, located at the Institute of Nuclear Physics PAN in Kraków.

Collaboration between CERN and Polish industry was initiated by Maciej Chorowski, and there are numerous examples. One is the purchase of vacuum vessels manufactured by CHEMAR in Kielce and RAFAKO in Racibórz, and parts of cryostats from METALCHEM in Kościan. Industrial supplies for CERN were also provided by KrioSystem in Wrocław and Turbotech in Płock, including elements of cryostats for testing prototype superconducting magnets for the LHC. CERN also operates devices manufactured by the ZPAS company in Wolibórz, while Polish company ZEC Service has been awarded CMS Gold awards for the delivery and assembly of cooling installations. Creotech Instruments – a company established by a physicist and two engineers who met at CERN – is a regular manufacturer of electronics for CERN and enjoys a strong collaboration with CERN’s engineering teams. Polish companies also transfer technology from CERN to industry, such as TECHTRA in Wrocław, which obtained a license from CERN for the production and commercialisation of GEM (Gas Electron Multiplier) foil. Deliveries to CERN are also carried out, inter alia, by FORMAT, Softcom and Zakład Produkcji Doświadczalnej CEBEA from Bochnia. At the most recent exhibition of Polish industry at CERN, Poland@CERN 2019, over 20 companies and institutions represented by around 60 participants took part in more than 120 networking meetings.

Poland30 Schools out

Societal impact
CERN membership has so far enabled around 550 Polish teachers to visit the lab, each returning to their schools with enhanced knowledge and enthusiasm to pass on to younger generations. Poland ranks sixth in Europe in terms of participation in particle-physics masterclasses, and at least 10 PhD theses in Poland based on CERN research are defended annually. Over the past 30 years, CERN has also become a second home for some 560 technical, doctoral or administrative students and 180 summer students, while Polish nationals have taken approximately 150 staff positions and 320 fellowships.

Some have taken important positions at CERN. Agnieszka Zalewska was chair of the CERN Council from 2013 to 2015, Ewa Rondio acted as a member of CERN’s directorate in 2009-2010 and Michał Turała chaired the electronics-and-computing-for-physics division in 1995-1998. Also, several of our colleagues were elected as members of CERN bodies such as the Scientific Policy Committee. Our national community at CERN is well integrated, and likes to spend time together outside working hours, in particular on mountain hikes and at summer picnics.

Poland’s accession to CERN membership 30 years ago was the very first case of the return of our nation to European structures, preceding the European Union and NATO. Poland joined the European Synchrotron Radiation Facility in 2004, the Institut Laue-Langevin in 2006 and the European Space Agency in 2012. It was also a founding member of the European Spallation Source and the Facility for Antiproton and Ion Research, and is a partner of the European X-ray Free-Electron Laser.

Today, six research institutes and 11 university departments located in eight major Polish cities are focused on high-energy physics. Among domestic projects that have benefitted from CERN technology transfer are the Jagiellonian PET detector, which is exploring the use of inexpensive plastic scintillators for whole-body PET imaging, and the development of electron linacs for radiotherapy and cargo scanning at the National Centre for Nuclear Research in Świerk, Warsaw.

During the past few years, thanks to closer alignment between participation in CERN experiments and the national roadmap for research infrastructures, the long-term funding scheme for Poland’s CERN membership has been stabilised. This fact, together with the highlights described here, allows us to expect that in the future CERN will be even more “Polish”.

The post Poland marks 30 years at CERN appeared first on CERN Courier.

]]>
Feature Three decades since the Polish flag was hoisted at the entrance to CERN, Tadeusz Lesiak recollects the genesis of Poland’s membership and reflects on its impact. https://cerncourier.com/wp-content/uploads/2021/06/Poland30-Wired-in-Feature.jpg
Le Neutrino de Majorana https://cerncourier.com/a/le-neutrino-de-majorana/ Fri, 25 Jun 2021 07:23:26 +0000 https://preview-courier.web.cern.ch/?p=92986 Nils Barrellon’s novel Le Neutrino de Majorana is tailor-made for the entertainment of physicists and physics enthusiasts alike. 

The post Le Neutrino de Majorana appeared first on CERN Courier.

]]>
Le Neutrino de Majorana

Naples, 1938. Ettore Majorana, one of the physics geniuses of the 20th century, disappears mysteriously and never comes back. A tragedy, and a mystery that has captivated many writers. 

The latest oeuvre, Nils Barrellon’s Le Neutrino de Majorana, is a French-language detective novel situated somewhere at the intersection of physics history and science outreach. Beginning with Majorana’s birth in 1906, Barrellon highlights the events that shaped and established quantum mechanics. With factual moments and original letters, he focuses on Majorana’s personal and scholarly life, while putting a spotlight on the ragazzi di via Panisperna and other European physicists who had to face the Second World War. In parallel, a present-day neutrino physicist is found killed right at the border of France and Switzerland. Majorana’s volumetti (his unpublished research notes) become the leitmotif unifying the two stories. Barrellon compares the two eras of research by entangling the storylines to reach a dramatic climax.

Using the crime hook as the predominant storyline, the author keeps the lay reader on the edge of their seat, while comically playing with subtleties most Cernois would recognise, from cultural differences between the two bordering countries to clichés about particle physicists, via passably detailed procedures of access to the experimental facilities – a clear proof of the author (who is also a physics school teacher) having been on-site. The novel feels like a tailor-made detective story for the entertainment of physicists and physics enthusiasts alike. 

And, at the end of the day, what explanation for Majorana’s disappearance could be more soothing than a love story?

The post Le Neutrino de Majorana appeared first on CERN Courier.

]]>
Review Nils Barrellon’s novel Le Neutrino de Majorana is tailor-made for the entertainment of physicists and physics enthusiasts alike.  https://cerncourier.com/wp-content/uploads/2021/06/CCJulAug21_REV_neutrino_feature.jpg
Harnessing the CERN model https://cerncourier.com/a/harnessing-the-cern-model/ Mon, 03 May 2021 09:00:34 +0000 https://preview-courier.web.cern.ch/?p=92159 Experimental physicist Paul Lecoq’s half-century-long career illustrates the power of CERN in fostering international collaboration.

The post Harnessing the CERN model appeared first on CERN Courier.

]]>
Paul Lecoq in China in 1982

CERN’s international relationships are central to its work, and a perfect example of nations coming together for the purpose of peaceful research, regardless of external politics. Through working in China during the 1980s and the Soviet Union/Russia in the early 1990s, physicist Paul Lecoq’s long career is testament to CERN’s influence and standing.

Originally interested in astrophysics, Lecoq completed a PhD in nuclear physics in Montreal in 1972. After finishing his military service, during which he taught nuclear physics at the French Navy School, he came across an advertisement for a fellowship position at CERN. It was the start of a 47-year-long journey with the organisation. “I thought, why not?” Lecoq recalls. “CERN was not my initial target, but I thought it would be a very good place to go. Also, I liked skiing and mountains.”

Royal treatment

During his third year as a fellow, a staff position opened for the upcoming European Hybrid Spectrometer (EHS), which would test CERN’s potential for collaboration beyond its core member states. “The idea was to make a complex multi-detector system, which would be a multi-institute collaboration, with each institute having the responsibility to build one detector,” says Lecoq. One of these institutes was based in Japan, allowing the exchange of personnel. Lecoq was one of the first to benefit from this agreement and, thanks to CERN’s already substantial image, he was very well-received. “At the time, people were travelling much less than now, and Japan was more isolated. I was welcomed by the president of the university and had a very nice reception almost every day.” It was an early sign of things to come for Lecoq.

During the lifetime of the EHS, a “supergroup” of CERN staff was formed whose main role was to support partners across the world while also building part of the experiment. By the time the Large Electron–Positron Collider (LEP) came to fruition it was clear that it would also benefit from this successful approach. At that time, Sam Ting had been asked to propose an experiment for LEP by then Director-General Herwig Schopper, which would become the L3 experiment, and with the EHS coming to an end, says Lecoq, it was natural that the EHS supergroup was transferred to Ting. Through friends working in material science, Lecoq caught wind of the new scintillator crystal (BGO) that was being proposed for L3 – an idea that would see him link up with Ting and spend much of the next few years in China. 

BGO crystals had not yet been used in particle physics, and had only existed in a few small samples, but L3 needed more than 1 m³ of coverage. After sampling and testing the first crystal samples, Lecoq presented his findings at an L3 collaboration meeting. “At the end of the meeting, Ting pointed his finger in my direction and asked if I was free on Saturday. I responded, ‘yes sir’. Then he turned to his secretary and said, ‘book a flight ticket to Shanghai – this guy is coming with me!’”

Unknown to Lecoq upon his arrival in China, Ting had already prepared the possibility to develop the technology for the mass production of BGO crystals there, and wanted Lecoq to oversee this production. BGO was soon recognised as a crystal that could be produced in large quantities in a reliable and cost-effective way, and it has since been used in a generation of PET scanners. Lecoq was impressed by the authority Ting held in China. “The second day we were in China, we, well Ting, had been invited by the mayor of Shanghai for a dinner to discuss the opportunity for the experiment.” The mayor was Jiang Zemin, who only a few years later became China’s president. “I have been very lucky to have several opportunities like this in my career. This is something unique about CERN, where you can meet fantastic people that can completely change your life. It was also an interesting period when China was slowly opening up to the world – on my first trip everyone was in Mao suits, and in the next three to five years I could see a tremendous change that was so impressive.”

Lecoq’s journeyman career did not stop there. With LEP finishing towards the turn of the millennium and LHC preparations in full swing, his expertise was needed for the production of lead tungstate (PWO) crystals for CMS’s electromagnetic calorimeter. This time, however, Russia was the base of operations, and the 1.2 m³ of BGO crystal for L3 became more than 10 m³ of PWO for CMS. As with his spell in China, Lecoq was in Russia during a politically uncertain time, with his arrival shortly following the fall of the Berlin Wall. “There was no system anymore. But there was still very strong intellectual activity, with scientists at an incredible level, and there was still a lot of production infrastructure for military interest.”

At the time, lithium niobate, a crystal very similar to PWO, was being exploited for radar communication and missile guidance, says Lecoq, and the country had a valuable (but unknown to the public) production-infrastructure in place. With the disarray at the end of the Cold War, the European Commission set up a system, along with Canada, Japan and the US, called the International Science and Technology Center (ISTC), whose role was to transfer the Soviet Union’s military industry into civil application. Lecoq was able to meet with ISTC and gain €7 million in funding to support PWO crystal production for CMS. Again, he stresses, this only happened due to the stature of CERN. “I could not have done that if I had been working only as a French scientist. CERN has the diplomatic contact with the European Commission and different governments, and that made it a lot easier.” Lecoq was responsible for choosing where the crystal production would take place. “These top-level scientists working in the military areas felt isolated, especially in a country that was in a period of collapse, so they were more than happy not only to have an opportunity to do their job under better conditions, but also to have the contacts. It was interesting not only at the scientific level, but on a human level too.”

Crystal clear

Back at CERN, Lecoq realised that introducing a new scintillating crystal, optimising its performance for the harsh operating conditions of the LHC, and developing mass-production technologies to produce large amounts of crystal in a reliable and cost-effective way, was a formidable challenge that could not be dealt with only by particle physicists. Therefore, in 1991, he decided to establish the Crystal Clear multidisciplinary collaboration, gathering experts in material science, crystal growth, luminescence, solid-state physics and beyond. Here again, he says, the attractiveness of CERN as an internationally recognised research centre was a great help in convincing institutes all over the world, some not connected to particle physics at all, to join the collaboration. Crystal Clear is still running today, and celebrating its 30th anniversary.

By developing international connections in unexpected places, Lecoq’s career has helped build sustained links for CERN in some of the world’s largest and most scientifically fruitful places. Now retired, he is a distinguished professor at the Polytechnic University in Valencia, where he has set up a public–private partnership laboratory for metamaterial-based scintillators and photodetectors, to aid a new generation of ionising-radiation detectors for medical imaging and other applications. Even now, he is able to flex the muscles of the CERN model by keeping in close contact with the organisation.

“My career at CERN has been extremely rich. I have changed so much in the countries I’ve worked with and the scientific aspect, too. It could only have been possible at CERN.”

The post Harnessing the CERN model appeared first on CERN Courier.

]]>
Careers Experimental physicist Paul Lecoq’s half-century-long career illustrates the power of CERN in fostering international collaboration. https://cerncourier.com/wp-content/uploads/2021/04/CCMayJun21_CAREERS_Lecoq.jpg
Physics flies high at SINP https://cerncourier.com/a/physics-flies-high-at-sinp/ Mon, 03 May 2021 08:59:06 +0000 https://preview-courier.web.cern.ch/?p=92128 Eduard Boos and Victor Savrin look back at 75 years of developments at Russia’s Skobeltsyn Institute of Nuclear Physics.

The post Physics flies high at SINP appeared first on CERN Courier.

]]>
The main building of SINP MSU

The Skobeltsyn Institute of Nuclear Physics (SINP) was established at Lomonosov Moscow State University (MSU) on 1 February 1946, in pursuance of a decree of the government of the USSR. SINP MSU was created as a new type of institute, in which the principles of integrating higher education and fundamental science were prioritised. Its initiator and first director was Soviet physicist Dmitri Vladimirovich Skobeltsyn, who was known for his pioneering use of the cloud chamber to study the Compton effect in 1923 – aiding the discovery of the positron less than a decade later.

It is no coincidence that SINP MSU was established in the immediate aftermath of the Second World War, following the first use of nuclear weapons in conflict. The institute was created on the basis that it would train personnel who would specialise in nuclear science and technology, after the country realised that there was a shortage of specialists in the field. Thanks to strong leadership from Skobeltsyn and one of his former pupils, Sergei Nikolaevich Vernov, SINP MSU quickly gained recognition in the country. As early as 1949, the government designated it a leading research institute. By this time a 72 cm cyclotron was already in use, the first to be used in a higher education institute in the USSR.

Skobeltsyn and Vernov continued with their high ambitions as they expanded the facility to the Lenin Hills, along with other scientific departments in MSU. Proposed in 1949 and opened in 1953, the new building in Moscow was granted approval to build a set of accelerators and a special installation for studying extensive air showers (EASs). The first accelerator built there was a 120 cm cyclotron, and its first outstanding scientific achievement was the discovery by A F Tulinov of the so-called “shadow effect” in nuclear reactions on single crystals, which makes it possible to study nuclear reactions at ultra-short time intervals. Significant scientific successes were associated with the commissioning of a unique installation, the EAS-MSU, at the end of the 1950s for the study of ultra-high-energy cosmic rays. Several results were obtained through a new method for studying EASs in the region of 10¹⁵–10¹⁷ eV, leading to the discovery of the famous “knee” in the energy spectrum of primary cosmic rays.

The space race 

1949 marked SINP MSU’s entrance into astrophysics and, in particular, satellite technology. The USSR’s launch of Sputnik 1, Earth’s first artificial satellite, in 1957 gave Vernov, an enthusiastic experimentalist who had previously researched cosmic rays in the Earth’s stratosphere, the opportunity to study outer-atmosphere cosmic rays. This led to the installation of a Geiger counter on the Sputnik 2 satellite and a scintillation counter on Sputnik 3, to enable radiation experiments. Vernov’s experiments on Sputnik 2 enabled the first detection of the outer radiation belt. However, this was not confirmed until 1958 by the US’s Explorer 1, which carried an instrument designed and built by James Van Allen. Sputnik 3 confirmed the existence of an inner radiation belt, having received information from Australia and South America, as well as from sea-based stations. 

Soyuz carrier rocket

Vernov, who was Skobeltsyn’s successor as SINP director in 1960–1982, later worked on the “Electron” and “Proton” series of satellites, which studied the radiation-belt structure, energy spectra and temporal variations associated with geomagnetic activity. This led to pioneering results on the spectrum and composition of galactic cosmic rays, and to the first model of radiation distribution in near-Earth space in the USSR.

SINP MSU has carried on Vernov’s cosmic legacy by continuing to develop equipment for satellites. Since 2005 the institute has developed its own space programme through the university satellites Tatiana-Universitetsky and Tatiana-2, as well as the Vernov satellite. These satellites led to the discovery of new phenomena such as ultraviolet flashes from the atmosphere. In 2016 a tracking system for ultraviolet rays was installed on board the Lomonosov satellite (see “Vernov’s legacy” image), developed at SINP MSU under the guidance of former director Mikhail Igorevich Panasyuk. This allowed the fluorescence light radiated by EASs of ultra-high-energy cosmic rays to be measured for the first time, and enabled prompt-emission observations of gamma-ray bursts at multiple wavelengths. The leading role in the entire Lomonosov satellite mission belongs to the current rector of MSU, Victor Sadovnichy.

High-energy exploration 

In 1968, under strong endorsement by Vernov and the director of a new Russian accelerator centre in Protvino, Anatoly Alekseyevich Logunov (who went on to be MSU rector from 1977 to 1991), a department of high-energy physics was established under the leadership of V G Shevchenko at SINP MSU, and the following year it was decided that a high-energy laboratory would be established at MSU. Throughout the years to follow, collaborations with laboratories in the USSR and across the world, including CERN, Fermilab, DESY and the Joint Institute for Nuclear Research (JINR), led the department to the forefront of the field.

At the end of the 1970s a centre was created at SINP MSU for bubble-chamber film analysis. At the time it was one of the largest automated complexes for processing and analysing information from large tracking detectors in the country. In collaboration with other institutes worldwide, staff at the institute studied soft hadronic processes in the energy range 12–350 GeV at a number of large facilities, including the Mirabelle Hydrogen Bubble Chamber and European Hybrid Spectrometer. 

Extensive and unique experimental data have been obtained on the characteristics of multiple hadron production, including fragmentation distributions. Throughout the years, exclusive reaction channels, angular and momentum correlations of secondary particles, resonance production processes and annihilation processes were also investigated. These results have made it possible to reliably test the predictions of phenomenological models, including the dual-parton model and the quark–gluon string model, based on the fundamental theoretical scheme of dual-topological unitarisation.

For the first time in Russia, together with a number of scientific and technical enterprises and with SINP MSU playing the leading role, an integrated system has now been created for the development, design, mass production and testing of large silicon solid and microstrip detectors. On this basis, at the turn of the millennium a hadron–electron separator was built for the ZEUS experiment at HERA, DESY.

Rolf Heuer visit to Lomonosov Moscow State University

The institute delved into theoretical studies in 1983, with the establishment of the laboratory of symbolic computations in high-energy physics and, in 1990, the department of theoretical high-energy physics. One of its most striking achievements was the creation of the CompHEP software package, which has received global recognition for its ability to automate calculations of collisions between elementary particles and their decays within the framework of gauge theories. This is freely available and allows physicists (even those with little computer experience) to calculate cross sections and construct various distributions for collision processes within the Standard Model and its extensions. Members of the department later went on to make a significant contribution to the creation of a Tier-2 Grid computer segment in Russia for processing and storing data from the LHC detectors.

Over the past 35 years, accelerator research at SINP MSU has moved from the development of large accelerator complexes for fundamental studies to the creation and production of applied accelerators for security systems, industry and medicine.

Teaching legacy

Throughout its 75 years, SINP MSU has also nurtured thousands of students. In 1961 a new branch of SINP MSU, the department for nuclear research, was established in Dubna. It became the basis for training students from the MSU physics faculty in nuclear physics using the capabilities of the largest international scientific centre in Russia – JINR. The department, which is still going strong today, teaches with a hands-on approach, with students attending lectures by leading JINR scientists and taking part in practical training held at the JINR laboratories.

The institute is currently participating in the upgrade of the LHC detectors (CMS, ATLAS, LHCb) for the HL-LHC project, as well as in projects within the Physics Beyond Colliders initiative (e.g. NA64, SHiP). These actions are under the umbrella of a 2019 cooperation agreement between CERN and Russia concerning high-energy physics and other domains of mutual interest. Looking even further ahead, SINP MSU scientists are also working on the development of research programmes for future collider projects such as the FCC, CLIC and ILC. Furthermore, the institute is involved in the upcoming NICA Complex in Russia, which plans to finish construction in 2022.

After 75 years, the institute is still as relevant as ever, and whatever the next chapter of particle physics will be, SINP MSU will be involved.

The post Physics flies high at SINP appeared first on CERN Courier.

]]>
Feature Eduard Boos and Victor Savrin look back at 75 years of developments at Russia’s Skobeltsyn Institute of Nuclear Physics. https://cerncourier.com/wp-content/uploads/2021/04/CCMayJun21_RUSSIA_feature.jpg
Scientific journeys of a “Sputnik kid” https://cerncourier.com/a/scientific-journeys-a-physicist-explores-the-culture-history-and-personalities-of-science/ Fri, 29 Jan 2021 12:52:35 +0000 https://preview-courier.web.cern.ch/?p=90824 Frederick Dylla's debut book puts a multidisciplinary historical perspective on the actors and events that shaped science.

The post Scientific journeys of a “Sputnik kid” appeared first on CERN Courier.

]]>
Scientific journeys

H Frederick Dylla is a “Sputnik kid”, whose curiosity and ingenuity led him through a successful 50-year career in physics, from plasmas to accelerators to leading the American Institute of Physics. His debut book, Scientific Journeys: A Physicist Explores the Culture, History and Personalities of Science, is a collection of essays that offers a multidisciplinary historical perspective on the actors and events that shaped the world of science and scholarly publishing. Through geopolitical and economic context and a rich record of key events, he highlights innovations that have found their use in social and business applications. Those cited as having contributed to global technological progress range from the web and smartphones to medical imaging and renewable energy.

The book is divided into five chapters: “signposts” (in the form of key people and events in scientific history); mentors and milestones in his life; science policy; communicating science; and finally a brief insight into the relationship between science and art. He begins with the story of medieval German abbess, mystic, composer and medicinal botanist Hildegard of Bingen: “a bright signpost of scholarship”. Dylla goes on to explore the idea that a single individual at the right time and place can change the course of history. Bounding through the centuries, he highlights the importance of science policy and science communication, the funding of big and small science alike, and the contemporary challenges linked to research, teaching science and scholarly publishing. Examples among these, says Dylla, are the protection of scientific integrity, new practices of distance learning and the weaknesses of the open-access model. The book ends bang up to date with a thought on the coronavirus pandemic and science’s key role in overcoming it.

Intended for teachers, science historians and students from high school to graduate school, Dylla’s book puts a face on scientific inventions. The weightiest chapter, mentors and milestones, focuses on personalities who have played an important role in his scientific voyage. Among the many named, however, Mildred Dresselhaus – the “queen of carbon” – is the only female scientist featured in the book besides Hildegard. Though by beginning the book with a brilliant but at best scientifically adjacent abbess who preceded Galileo by four centuries Dylla tacitly acknowledges the importance of representing diversity, the book unintentionally makes it discomfortingly clear how scarce role models for women can be in the white-male dominated world of science. The lack of a discussion on diversity is a missed opportunity in an otherwise excellent book.

The post Scientific journeys of a “Sputnik kid” appeared first on CERN Courier.

]]>
Review Frederick Dylla's debut book puts a multidisciplinary historical perspective on the actors and events that shaped science. https://cerncourier.com/wp-content/uploads/2021/01/CCJanFeb21_REVIEWS_dyalla_feature.jpg
Discovery machines https://cerncourier.com/a/discovery-machines/ Wed, 27 Jan 2021 15:07:53 +0000 https://preview-courier.web.cern.ch/?p=90971 50 years ago CERN’s Intersecting Storage Rings set in motion a series of hadron colliders charting nature at the highest possible energies.

The post Discovery machines appeared first on CERN Courier.

]]>
CERN’s Intersecting Storage Rings in 1974

The ability to collide high-energy beams of hadrons under controlled conditions transformed the field of particle physics. Until the late 1960s, the high-energy frontier was dominated by the great proton synchrotrons. The Cosmotron at Brookhaven National Laboratory and the Bevatron at Lawrence Berkeley National Laboratory were soon followed by CERN’s Proton Synchrotron and Brookhaven’s Alternating Gradient Synchrotron, and later by the Proton Synchrotron at Serpukhov near Moscow. In these machines protons were directed to internal or external targets in which secondary particles were produced.

The kinematical inefficiency of this process, whereby the centre-of-mass energy only increases as the square root of the beam energy, was recognised from the outset. In 1943, Norwegian engineer Rolf Widerøe proposed the idea of colliding beams, keeping the centre of mass at rest in order to exploit the full energy for the production of new particles. One of the main problems was to get colliding beam intensities high enough for a useful event rate to be achieved. In the 1950s the prolific group at the University of Wisconsin Midwestern Universities Research Association (MURA), led by Donald Kerst, worked on the problem of “stacking” particles, whereby successive pulses from an injector synchrotron are superposed to increase the beam intensity. They mainly concentrated on protons, where Liouville’s theorem (which states that for a continuous fluid under the action of conservative forces the density of phase space cannot be increased) was thought to apply. Only much later, ways to beat Liouville and to increase the beam density were found. At the 1956 International Accelerator Conference at CERN, Kerst made the first proposal to use stacking to produce colliding beams (not yet storage rings) of sufficient intensity.
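
To make that square-root scaling explicit, a textbook kinematic comparison (an illustration added here, not taken from the original article) for a beam of energy E_beam much larger than the particle masses reads:

\[ \sqrt{s}\,\big|_{\text{fixed target}} \approx \sqrt{2\,E_{\text{beam}}\,m_{\text{target}}c^{2}}\,, \qquad \sqrt{s}\,\big|_{\text{collider}} = 2\,E_{\text{beam}}\,. \]

For 31 GeV protons striking protons at rest the first expression gives only about 8 GeV, whereas the same beams brought into head-on collision at the ISR yielded roughly 62 GeV.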

SppS in 1983

At that same conference, Gerry O’Neill from Princeton presented a paper proposing that colliding electron beams could be achieved in storage rings by making use of the natural damping of particle amplitudes by synchrotron-radiation emission. A design for the 500 MeV Princeton–Stanford colliding beam experiment was published in 1958 and construction started that same year. At the same time, the Budker Institute for Nuclear Research in Novosibirsk started work on VEP-1, a pair of rings designed to collide electrons at 140 MeV. Then, in March 1960, Bruno Touschek gave a seminar at Laboratori Nazionali di Frascati in Italy where he first proposed a single-ring, 0.6 m-circumference 250 MeV electron–positron collider. “AdA” produced the first stored electron and positron beams less than one year later – a far cry from the time it takes today’s machines to go from conception to operation! From these trailblazers evolved the production machines, beginning with ADONE at Frascati and SPEAR at SLAC. However, it was always clear that the gift of synchrotron-radiation damping would become a hindrance to achieving very high energy collisions in a circular electron–positron collider because the power radiated increases as the fourth power of the beam energy and the inverse fourth power of mass, so is negligible for protons compared with electrons.
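
The fourth-power scalings quoted above follow from the classical expression for the energy radiated per turn (a standard textbook result, added here for illustration): for a particle of charge e, mass m and energy E = γmc² on a ring of bending radius ρ,

\[ U_{0} = \frac{e^{2}\,\beta^{3}\gamma^{4}}{3\,\varepsilon_{0}\,\rho} \propto \frac{E^{4}}{m^{4}\,\rho}\,, \]

so at equal energy and bending radius a proton radiates roughly (m_e/m_p)^4 ≈ 10^-13 times as much as an electron, which is negligible for the hadron machines discussed here but ultimately prohibitive for very-high-energy circular electron–positron colliders.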

A step into the unknown

Meanwhile, in the early 1960s, discussion raged at CERN about the next best step for particle physics. Opinion was sharply divided between two camps, one pushing a very high-energy proton synchrotron for fixed-target physics and the other using the technique proposed at MURA to build an innovative colliding beam proton machine with about the same centre-of-mass energy as a conventional proton synchrotron of much larger dimensions. In order to resolve the conflict, in February 1964, 50 physicists from among Europe’s best met at CERN. From that meeting emerged a new committee, the European Committee for Future Accelerators, under the chairmanship of one of CERN’s founding fathers, Edoardo Amaldi. After about two years of deliberation, consensus was formed. The storage ring gained most support, although a high-energy proton synchrotron, the Super Proton Synchrotron (SPS), was built some years later and would go on to play an essential role in the development of hadron storage rings. On 15 December 1965, with the strong support of Amaldi, the CERN Council unanimously approved the construction of the Intersecting Storage Rings (ISR), launching the era of hadron colliders.

First collisions

Construction of the ISR began in 1966 and first collisions were observed on 27 January 1971. The machine, which needed to store beams for many hours without the help of synchrotron-radiation damping to combat inevitable magnetic field errors and instabilities, pushed the boundaries in accelerator science on all fronts. Several respected scientists doubted that it would ever work. In fact, the ISR worked beautifully, exceeding its design luminosity by an order of magnitude and providing an essential step in the development of the next generation of hadron colliders. A key element was the performance of its ultra-high-vacuum system, which was a source of continuous improvement throughout the 13 year-long lifetime of the machine.

For the experimentalists, the ISR’s collisions (which reached a centre-of-mass energy of 63 GeV) opened an exciting adventure at the energy frontier. But they were also learning what kind of detectors to build to fully exploit the potential of the machine – a task made harder by the lack of clear physics benchmarks known at the time in the ISR energy regime. The concept of general-purpose instruments built by large collaborations, as we know them today, was not in the culture of the time. Instead, many small collaborations built experiments with relatively short lifecycles, which constituted a fruitful learning ground for what was to come at the next generation of hadron colliders.

There was initially a broad belief that physics action would be in the forward directions at a hadron collider. This led to the Split Field Magnet facility as one of the first detectors at the ISR, providing a high magnetic field in the forward directions but a negligible one at large angle with respect to the colliding beams (the nowadays so-important transverse direction). It was with subsequent detectors featuring transverse spectrometer arms over limited solid angles that physicists observed a large excess of high transverse momentum particles above low-energy extrapolations. With these first observations of point-like parton scattering, the ISR made a fundamental contribution to strong-interaction physics. Solid angles were too limited initially, and single-particle triggers too biased, to fully appreciate the hadronic jet structure. That feat required third-generation detectors, notably the Axial Field Spectrometer (AFS) at the end of the ISR era, offering full azimuthal central calorimeter coverage. The experiment provided evidence for the back-to-back two-jet structure of hard parton scattering.

The Tevatron at Fermilab in 2011

For the detector builders, the original AFS concept was interesting as it provided an unobstructed phi-symmetric magnetic field in the centre of the detector, albeit at the price of massive Helmholtz coil pole tips obscuring the forward directions. Indeed, the ISR enabled the development of many original experimental ideas. A very important one was the measurement of the total cross section using very forward detectors in close proximity to the beam. These “Roman Pots”, named for their inventors, made their appearance in all later hadron colliders, confirming the rising total pp cross section with energy.

It is easy to say after the fact, still with regrets, that with an earlier availability of more complete and selective (with electron-trigger capability) second- and third-generation experiments at the ISR, CERN would not have been left as a spectator during the famous November revolution of 1974 with the J/ψ discoveries at Brookhaven and SLAC. These, and the ϒ resonances discovered at Fermilab three years later, were clearly observed in the later-generation ISR experiments.

SPS opens new era

However, events were unfolding at CERN that would pave the way to the completion of the Standard Model. At the ISR in 1972, the phenomenon of Schottky noise (density fluctuations due to the granular nature of the beam in a storage ring) was first observed. It was this very same noise that Simon van der Meer speculated in a paper a few years earlier could be used for what he called “stochastic cooling” of a proton beam, beating Liouville’s theorem by the fact that a beam of particles is not a continuous fluid. Although it is unrealistic to detect the motion of individual particles and damp them to the nominal orbit, van der Meer showed that by correcting the mean transverse motion of a sample of particles continuously, and as long as the statistical nature of the Schottky signal was continuously regenerated, it would be theoretically possible to reduce the beam size and increase its density. With the bandwidth of electronics available at the time, van der Meer concluded that the cooling time would be too long to be of practical importance. But the challenge was taken up by Wolfgang Schnell, who built a state-of-the-art feedback system that demonstrated stochastic cooling of a proton beam for the first time. This would open the door to the idea of stacking and cooling of antiprotons, which later led to the SPS being converted into a proton–antiproton collider.
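
As a rough indication of why bandwidth was the limiting factor, the cooling rate for N stored particles and a system bandwidth W is often written in the simplified textbook form (quoted here as an illustration, not from van der Meer’s original note)

\[ \frac{1}{\tau} = \frac{W}{N}\left[\,2g - g^{2}(M+U)\,\right] \lesssim \frac{W}{N}\,, \]

where g is the electronic gain, M the mixing factor and U the noise-to-signal ratio. Even at the optimum the cooling time is of order N/W: taking, say, 10^12 stored particles and a bandwidth of order 1 GHz gives times of order 10^3 seconds, and with the much smaller bandwidths available at the time the estimate stretched to impractically long values.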

The Large Hadron Collider in 2018

Another important step towards the next generation of hadron colliders occurred in 1973, when the collaboration working on the Gargamelle heavy-liquid bubble chamber published two papers revealing the first evidence for weak neutral currents. These were important observations in support of the unified theory of electromagnetic and weak interactions, for which Sheldon Glashow, Abdus Salam and Steven Weinberg were to receive the Nobel Prize in Physics in 1979. The electroweak theory predicted the existence and approximate masses of two vector bosons, the W and the Z, but these masses were too high for the particles to be produced at any existing machine. However, Carlo Rubbia and collaborators proposed that, if the SPS could be converted into a collider with protons and antiprotons circulating in opposite directions, there would be enough energy to create them.

To achieve this, the SPS would need to be converted into a storage ring like the ISR, but this time the beams would need to be kept “bunched”, with the radio-frequency (RF) system working continuously, to achieve a high enough luminosity (unlike at the ISR, where the beams were allowed to de-bunch all around the ring). The challenges here were two-fold. Noise in the RF system caused particles to diffuse rapidly out of the bunch; this was solved by a dedicated feedback system. It was also predicted that the beam–beam interaction would limit the performance of a bunched-beam machine without synchrotron-radiation damping, owing to the strongly nonlinear interaction between a particle in one beam and the collective electromagnetic field of the other beam.

A much bigger challenge was to build an accumulator ring in which antiprotons could be stored and cooled by stochastic cooling until a sufficient intensity of antiprotons would be available to transfer into the SPS, accelerate to around 300 GeV and collide with protons. This was done in two stages. First a proof-of-principle was needed to show that the ideas developed at the ISR transferred to a dedicated accumulator ring specially designed for stochastic cooling. This ring was called the Initial Cooling Experiment (ICE), and operated at CERN in 1977–1978. In ICE transverse cooling was applied to reduce the beam size and a new technique for reducing the momentum spread in the beam was developed. The experiment proved to be a big success and the theory of stochastic cooling was refined to a point where a real accumulator ring (the Antiproton Accumulator) could be designed to accumulate and store antiprotons produced at 3.5 GeV by the proton beam from the 26 GeV Proton Synchrotron. First collisions of protons and antiprotons at 270 GeV were observed on the night of 10 July 1981, signalling the start of a new era in colliding beam physics.

The R702 experiment

A clear physics goal, namely the discovery of the W and Z intermediate vector bosons, drove the concepts for the two main SppS experiments, UA1 and UA2 (in addition to a few smaller, specialised experiments). It was no coincidence that the leaders of both collaborations were pioneers of ISR experiments, and many lessons from the ISR were taken on board. UA1 pioneered the concept of a hermetic detector, covering as much of the solid angle around the interaction region as possible with calorimetry and tracking. This allowed measurements of the missing transverse energy and momentum that signal the escaping neutrino in leptonic W decays. Both electrons and muons were measured, with tracking in a state-of-the-art drift chamber that provided bubble-chamber-like pictures of the interactions. The magnetic field was provided by a dipole-magnet configuration, an approach not favoured in later-generation experiments because of its inherent lack of azimuthal symmetry. UA2 featured a highly segmented (for the time) electromagnetic and hadronic calorimeter in the central region (down to 40 degrees with respect to the beam axis), with 240 cells pointing to the interaction region. But it had no muon detection and, in its initial phase, only limited electromagnetic coverage in the forward regions. There was no magnetic field except in the forward cones, where toroids probed the W polarisation.

In 1983 the SppS experiments made history with the direct discoveries of the W and Z. Many other results were obtained, including the first evidence for neutral B-meson particle–antiparticle mixing at UA1, thanks to its tracking and muon detection. The calorimetry of UA2 provided immediate, unambiguous evidence for a two-jet structure in events with large transverse energy. Both UA1 and UA2 pushed QCD studies far ahead. The lack of hermeticity in UA2’s forward regions motivated a major upgrade (UA2′) for the second phase of the collider, complementing the central part with new, fully hermetic calorimetry (both electromagnetic and hadronic) and inserting a new tracking cylinder employing novel technologies (fibre tracking and silicon pad detectors). This enabled the experiment to improve searches for top quarks and supersymmetric particles, as well as to make the first precision measurements of the W mass in an almost background-free environment.

Meanwhile in America

At the time the SppS was driving new studies at CERN, the first large superconducting synchrotron (the Tevatron, with a design energy close to 1 TeV) was under construction at Fermilab. In view of the success of the stochastic cooling experiments, there was a strong lobby at the time to halt the construction of the Tevatron and to divert effort instead to emulate the SPS as a proton–antiproton collider using the Fermilab Main Ring. Wisely this proposal was rejected and construction of the Tevatron continued. It came into operation as a fixed-target synchrotron in 1984. Two years later it was also converted into a proton–antiproton collider and operated at the high-energy frontier until its closure in September 2011.

The UA1 detector

A huge step was made with the detector concepts for the Tevatron experiments, in terms of the physics signatures addressed and the sophistication and granularity of the detector components. This opened new and continuously evolving avenues in analysis methods at hadron colliders. Already the initial CDF and DØ detectors for Run I (which lasted until 1996) were designed with cylindrical concepts, characteristic of what we now call general-purpose collider experiments, although DØ, in contrast to CDF with its 1.4 T solenoid, still lacked a central magnetic field. In 1995 the experiments delivered the first Tevatron highlight: the discovery of the top quark. Both detectors underwent major upgrades for Run II (2001–2011) – a theme now seen for the LHC experiments – which had a great impact on the Tevatron’s physics results. CDF was equipped with a new tracker, a silicon vertex detector, new forward calorimeters and muon detectors, while DØ added a 1.9 T central solenoid, vertexing and fibre tracking, and new forward muon detectors. Alongside the instrumentation came a breathtaking evolution in real-time event selection (triggering) and data acquisition to keep up with the increasing luminosity of the collider.

The physics harvest of the Tevatron experiments during Run II was impressive, including a wealth of QCD measurements and major inroads into top-quark physics, heavy-flavour physics and searches for phenomena beyond the Standard Model. Still standing strong are its precision measurements of the W and top masses and of the electroweak mixing angle sin²θW. The story ended around 2012 with a glimpse of the Higgs boson in associated production with a vector boson. The CDF and DØ experience influenced the LHC era in many ways: for example, they were able to extract the rare single-top production cross-section with sophisticated multivariate algorithms, and they demonstrated the power of combining mature single-experiment measurements in common analyses to achieve the ultimate precision and sensitivity.

For the machine builders, the pioneering role of the Tevatron as the first large superconducting machine was also essential for further progress. Two other machines – the Relativistic Heavy Ion Collider at Brookhaven and the electron–proton collider HERA at DESY – derived directly from the experience of building the Tevatron. Lessons learned from that machine and from the SppS were also integrated into the design of the most powerful hadron collider yet built: the LHC.

The Large Hadron Collider

The LHC had a difficult birth. Although the idea of a large proton–proton collider at CERN had been around since at least 1977, the approval of the Superconducting Super Collider (SSC) in the US in 1987 put the whole project into doubt. The SSC, with a centre-of-mass energy of 40 TeV, was almost three times more powerful than what could ever be built using the existing infrastructure at CERN. It was only the resilience and conviction of Carlo Rubbia, who shared the 1984 Nobel Prize in Physics with van der Meer for the project leading to the discovery of the W and Z bosons, that kept the project alive. Rubbia, who became Director-General of CERN in 1989, argued that, in spite of its lower energy, the LHC could be competitive with the SSC by having a luminosity an order of magnitude higher, and at a fraction of the cost. He also argued that the LHC would be more versatile: as well as colliding protons, it would be able to accelerate heavy ions to record energies at little extra cost.

The Tevatron’s CDF detector

The SSC was eventually cancelled in 1993. This made the case for the LHC even stronger, but the financial climate in Europe at the time was not conducive to the approval of a large project. For example, CERN’s largest contributor, Germany, was struggling with the cost of reunification and many other countries were getting to grips with the introduction of the single European currency. In December 1993 a plan was presented to the CERN Council to build the machine over a 10-year period by reducing the other experimental programmes at CERN to the absolute minimum, with the exception of the full exploitation of the flagship Large Electron Positron (LEP) collider. Although the plan was generally well received, it became clear that Germany and the UK were unlikely to agree to the budget increase required. On the positive side, after the demise of the SSC, a US panel on the future of particle physics recommended that “the government should declare its intentions to join other nations in constructing the LHC”. Positive signals were also being received from India, Japan and Russia.

In June 1994 the proposal to build the LHC was made once more. However, approval was blocked by Germany and the UK, which demanded substantial additional contributions from the two host states, France and Switzerland. This forced CERN to propose a “missing magnet” machine where only two thirds of the dipole magnets would be installed in a first stage, allowing operation at reduced energy for a number of years. Although costing more in the long run, the plan would save some 300 million Swiss Francs in the first phase. This proposal was put to Council in December 1994 by the new Director-General Christopher Llewellyn Smith and, after a round of intense discussions, the project was finally approved for two-stage construction, to be reviewed in 1997 after non-Member States had made known their contributions. The first country to do so was Japan in 1995, followed by India, Russia and Canada the next year. A final sting in the tail came in June 1996 when Germany unilaterally announced that it intended to reduce its CERN subscription by between 8% and 9%, prompting the UK to demand a similar reduction and forcing CERN to take out loans. At the same time, the two-stage plan was dropped and, after a shaky start, the construction of the full LHC was given the green light.

The fact that the LHC was to be built at CERN, making full use of the existing infrastructure to reduce cost, imposed a number of strong constraints. The first was the 27 km circumference of the LEP tunnel in which the machine was to be housed. For the LHC to achieve its design energy of 7 TeV per beam, its bending magnets would need to operate at a field of 8.3 T, about 60% higher than ever achieved in previous machines. This could only be done with affordable superconducting material by reducing the temperature of the liquid-helium coolant from its normal boiling point of 4.2 K to 1.9 K, where helium becomes superfluid, a macroscopic quantum state with vanishing viscosity and a very large thermal conductivity. A second major constraint was the small (3.8 m) tunnel diameter, which made it impossible to house two independent rings as at the ISR. Instead, a novel and elegant magnet design, first proposed by Bob Palmer at Brookhaven, with the two rings separated by only 19 cm in a common yoke and cryostat, was developed. This also considerably reduced the cost.
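The quoted field follows from a back-of-the-envelope calculation using the standard rigidity relation for a relativistic proton; the roughly 2.8 km bending radius assumed below is the commonly quoted effective radius of the LHC dipoles (an assumption for illustration, not a figure taken from this article).

```python
# Required dipole field for a 7 TeV proton beam,
# from p [GeV/c] = 0.2998 * B [T] * rho [m].
p_gev_per_c = 7000.0    # design beam momentum
rho_m = 2804.0          # assumed effective bending radius of the dipoles

b_tesla = p_gev_per_c / (0.2998 * rho_m)
print(f"required dipole field: {b_tesla:.1f} T")   # ~8.3 T
```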


At precisely 09:30 on 10 September 2008, almost 15 years after the project’s approval, the first beam was injected into the LHC, amid global media attention. In the days that followed good progress was made until disaster struck: during a ramp to full energy, one of the 10,000 superconducting joints between the magnets failed, causing extensive damage which took more than a year to recover from. Following repairs and consolidation, on 29 November 2009 beam was once more circulating and full commissioning and operation could start. Rapid progress in ramping up the luminosity followed, and the LHC physics programme, at an initial energy of 3.5 TeV per beam, began in earnest in March 2010.

LHC experiments

The LHC detectors realised a whole new level of sophistication compared with those at previous colliders. The priority benchmark for the designs of the general-purpose detectors ATLAS and CMS was to unambiguously discover (or rule out) the Standard Model Higgs boson for all possible masses up to 1 TeV, which demanded the ability to measure a variety of final states. The challenges of the Higgs search also guaranteed the detectors’ potential for all kinds of searches for physics beyond the Standard Model, the other driving physics motivation at the energy frontier. These two very ambitious LHC detector designs integrated all the lessons learned from the experiments at the three predecessor machines, as well as further technology advances made in other large experiments, most notably at HERA and LEP.

Just a few simple numbers illustrate the giant leap from the Tevatron to the LHC detectors. CDF and DØ, in their upgraded versions operating at a luminosity of up to 4 × 10³² cm⁻² s⁻¹, typically had around a million channels and a triggered event rate of 100 Hz, with event sizes of 500 kB. The collaborations were each about 600 strong. By contrast, ATLAS and CMS operated during LHC Run 2 at a luminosity of 2 × 10³⁴ cm⁻² s⁻¹ with typically 100 million readout channels, and an event rate and size of 500 Hz and 1500 kB. Their publications have close to 3000 authors.
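Folding the quoted trigger rates and event sizes together gives a feel for the raw data rate to storage implied by these numbers (a rough illustration using only the figures above, not official experiment throughput specifications):

```python
# Approximate raw data rate to storage implied by the numbers quoted above.
tevatron_rate_hz, tevatron_event_kb = 100, 500
lhc_rate_hz, lhc_event_kb = 500, 1500

tevatron_mb_s = tevatron_rate_hz * tevatron_event_kb / 1000   # ~50 MB/s
lhc_mb_s = lhc_rate_hz * lhc_event_kb / 1000                  # ~750 MB/s

print(f"Tevatron Run II: ~{tevatron_mb_s:.0f} MB/s")
print(f"LHC Run 2:       ~{lhc_mb_s:.0f} MB/s "
      f"({lhc_mb_s / tevatron_mb_s:.0f} times more)")
```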

For many major LHC-detector components, complementary technologies were selected. This is most visible for the superconducting magnet systems, with an elegant and unique large 4 T solenoid in CMS serving both the muon and inner tracking measurements, and an air-core toroid system for the muon spectrometer in ATLAS together with a 2 T solenoid around the inner tracking cylinder. These choices drove the layout of the active detector components, for instance the electromagnetic calorimetry. Here again, different technologies were implemented: a novel-configuration liquid-argon sampling calorimeter for ATLAS and lead-tungstate crystals for CMS.

From the outset, the LHC was conceived as a highly versatile collider facility, not only for the exploration of high transverse-momentum physics. With its huge production of b and c quarks, it offered the possibility of a very fruitful programme in flavour physics, exploited with great success by the purposely designed LHCb experiment. Furthermore, in special runs the LHC provides heavy-ion collisions for studies of the quark–gluon plasma – the field of action for the ALICE experiment.

Just as the general-purpose experiments learned from the history of experiments in their field, the concepts of LHCb and ALICE also evolved from previous generations of experiments in their respective fields, a lineage that would be interesting to trace. One remark is due: the designs of all four main detectors at the LHC have turned out to be so flexible that there are no strict boundaries between these three physics fields. All of them have learned to use features of their instruments to contribute, at least in part, to the full physics spectrum offered by the LHC, the highlight of which so far was the July 2012 announcement of the discovery of the Higgs boson by the ATLAS and CMS collaborations. The following year the collaborations were named in the citation for the 2013 Nobel Prize in Physics awarded to François Englert and Peter Higgs.

CMS experiment

Since then, the LHC has exceeded its design luminosity by a factor of two and delivered an integrated luminosity of almost 200 fb⁻¹ in proton–proton collisions, while its beam energy was increased to 6.5 TeV in 2015. The machine has also delivered heavy ion (lead–lead) and even lead–proton collisions. But the LHC still has a long way to go before its estimated end of operations in the mid-to-late 2030s. To this end, the machine was shut down in November 2018 for a major upgrade of the whole of the CERN injector complex as well as the detectors to prepare for operation at high luminosities, ultimately up to a “levelled” luminosity of 7 × 10³⁴ cm⁻² s⁻¹. The High Luminosity LHC (HL-LHC) upgrade is pushing the boundaries of superconducting magnet technology to the limit, particularly around the experiments where the present focusing elements will be replaced by new magnets built from high-performance Nb₃Sn superconductor. The eventual objective is to accumulate 3000 fb⁻¹ of integrated luminosity.

In parallel, the LHC-experiment collaborations are preparing and implementing major upgrades to their detectors, using novel state-of-the-art technologies and revolutionary approaches to data collection, to exploit the tenfold data volume promised by the HL-LHC. Hadron-collider detector concepts have come a long way in sophistication over the past 50 years. However, behind the scenes are other factors paramount to their success. These include an equally spectacular evolution in data-flow architectures, software, computing approaches and analysis methods – all of which have been driven into new territories by the extraordinary need to dig out rare events from the huge backgrounds of ordinary collisions at hadron colliders. Worthy of particular mention in the success of all LHC physics results is the Worldwide LHC Computing Grid. This journey is now poised to continue as we consider how a general-purpose detector at a future 100 TeV hadron collider might look.

Beyond the LHC

Although the LHC has at least 15 years of operations ahead of it, the question now arises, as it did in 1964: what is the next step for the field? The CERN Council has recently approved the recommendations of the 2020 update of the European strategy for particle physics, which includes, among other things, a thorough study of a very high-energy hadron collider to succeed the LHC. A technical and financial feasibility study for a 100 km circular collider at CERN with a collision energy of at least 100 TeV is now under way. While a decision to proceed with such a facility is to come later this decade, one thing is certain: lessons learned from 50 years of experience with hadron colliders and their detectors will be crucial to the success of our next step into the unknown.

The post Discovery machines appeared first on CERN Courier.

Feature 50 years ago CERN’s Intersecting Storage Rings set in motion a series of hadron colliders charting nature at the highest possible energies. https://cerncourier.com/wp-content/uploads/2021/01/CCMarApr21_50YEARS_feature.jpg
A decade in LHC publications https://cerncourier.com/a/a-decade-in-lhc-publications/ Thu, 14 Jan 2021 14:26:54 +0000 https://preview-courier.web.cern.ch/?p=90589 The first ten years of LHC operations have generated a bumper crop of new knowledge.

The post A decade in LHC publications appeared first on CERN Courier.

In June 2020, the CMS collaboration submitted a paper titled “Observation of the production of three massive gauge bosons at √s = 13 TeV” to the arXiv preprint server. A scientific highlight in its own right, the paper also marked the collaboration’s thousandth publication. ATLAS is not far from reaching the same milestone, currently at 964 publications. With the rest of the LHC experiments taking the total number of papers to 2852, the first ten years of LHC operations have generated a bumper crop of new knowledge about the fundamental particles and interactions.

The publication landscape in high-energy physics (HEP) is unusual owing to the field’s long-held preprint culture. From the 1950s, paper copies were kept in the well-known red cabinets outside the CERN Library (pictured), but since 1991 preprints have been stored electronically at arXiv.org. Preprint posting and actual journal publication tend to happen in parallel, and citations between all types of publications are compiled and counted in the INSPIRE system.

2852 papers in one picture

Particle physics has been at the forefront of the open-science movement in publishing, software, hardware and, most recently, data. In 2004, former Director-General Robert Aymar encouraged the creation at CERN of SCOAP³ (Sponsoring Consortium for Open Access Publishing in Particle Physics). Devoted to converting closed-access HEP journals to open access, it has grown extensively and now counts over 3000 libraries from 44 countries. All original LHC research results have been published open access. The first collaboration articles by the four main experiments, describing the detector designs and published in the Journal of Instrumentation, remain amongst the most cited articles from the LHC collaborations and, despite being more than a decade old, are still among the journal’s most-read articles.

Closer analysis
Along with the 2852 publications by CERN’s LHC experiments, a further 380 papers have been written by individuals on behalf of the collaborations, and another 10,879 articles (preprints, conference proceedings, etc.) from the LHC experiments were not published in a journal. However, this only represents part of the scientific relevance of the LHC: tens of thousands of papers published over the past decade discuss the LHC experiments, use their data or build on the LHC findings. The papers published by the four experiments received on average 112 citations per paper, compared to an average of 41 citations per paper across all experimental papers indexed in INSPIRE, and 30 citations per paper across all HEP publications (4.8 million citations across 163,000 documents since 2008). Unsurprisingly, the number of citations peaks with the CMS and ATLAS papers on the Higgs discovery, with 10,910 and 11,195 citations respectively, which at the end of 2019 were the two most cited high-energy physics papers released in the past decade.
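The quoted HEP-wide average can be checked directly from the numbers in parentheses (a trivial verification of the arithmetic, nothing more):

```python
# Average citations per paper across all HEP publications since 2008,
# using the figures quoted above.
citations, documents = 4_800_000, 163_000
print(f"{citations / documents:.1f} citations per paper")   # ~29.4, i.e. about 30
```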

Large author numbers are another exceptional aspect of LHC-experiment publishing, with papers consistently carrying hundreds or even thousands of names. This culminated in a world record of 5,154 authors on a joint paper between CMS and ATLAS in 2015, which reduced the uncertainty on the measurement of the Higgs-boson mass to ±0.25%.

750 shades of model building

Teasing fluctuations
Ten years of LHC publications have established the Standard Model at unprecedented levels of precision. But they also reveal the hunger for new physics, as illustrated by the story of the 750 GeV diphoton “bump”. On 15 December 2015, ATLAS and CMS presented an excess of diphoton events at 750 GeV in proton–proton collisions, fuelling rumours that a new particle could be showing itself. While the significance of the excess was only 2σ and 1.6σ respectively, theorists were quick to respond with an influx of hundreds of papers (see “750 shades of model building”). The excitement was dampened, however, by the data released in August 2016, which showed no further sign of the anomaly, and the bump became commonly recognised as a statistical fluctuation – part and parcel of the scientific process, if ruining the fun for the theorists.

With the LHC to continue operations to the mid-2030s, and only around 6% of its expected total dataset collected so far, we can look forward to thousands more publications about nature’s basic constituents being placed in the public domain.

All numbers are correct as of 7 January.

The post A decade in LHC publications appeared first on CERN Courier.

Feature The first ten years of LHC operations have generated a bumper crop of new knowledge. https://cerncourier.com/wp-content/uploads/2021/01/2012-11-06Bldg52_Library1-3-e1610702967818.jpg
Russia’s particle-physics powerhouse https://cerncourier.com/a/russias-particle-physics-powerhouse/ Fri, 18 Dec 2020 14:11:16 +0000 https://preview-courier.web.cern.ch/?p=90416 The Institute for Nuclear Research in Moscow celebrates its 50th anniversary.

The post Russia’s particle-physics powerhouse appeared first on CERN Courier.

Timeline of INR RAS

Founded on 24 December 1970, the Institute for Nuclear Research of the Russian Academy of Sciences (INR RAS) is a large centre for particle physics in Moscow with wide participation in international projects. INR RAS conducts work on cosmology, neutrino physics, astrophysics, high-energy physics, accelerator physics and technology, neutron research and nuclear medicine. It is best known for its unique research facilities, spread all across Russia, and for its large-scale collaborations in neutrino and high-energy physics. These include experiments such as the Baksan Neutrino Observatory, and collaborations with a number of CERN experiments including CMS, ALICE, LHCb, NA61 and NA64.

The Institute was founded by a decree of the Presidium of the USSR Academy of Sciences, in accordance with a decision of the government. Theoretical physicist Moisey Markov had a crucial role in establishing the Institute and influenced the research that would later be undertaken. His ambition is seen in the decision to base INR RAS on three separate nuclear laboratories of the P.N. Lebedev Institute of Physics of the Academy of Sciences of the USSR. Each laboratory had a leading physicist in charge: the Atomic Nucleus Laboratory headed by Nobel laureate Ilya Frank; the Photonuclear Reactions Laboratory under the direction of Lyubov Lazareva; and a neutrino laboratory headed by Georgy Zatsepin and Alexander Chudakov. The man overseeing it all was the first director of INR RAS, Albert Tavkhelidze, a former researcher at the Joint Institute for Nuclear Research (JINR, Dubna). In 1987 he was replaced as director by Victor Matveev, and in 2014 by Leonid Kravchuk. Since 2020 the director of INR RAS has been Maxim Libanov.


From the very beginning, major efforts were focused on the construction and operation of large-scale research facilities. The hub of INR RAS was built 20 km outside Moscow, in a town called Troitsk. In 1973 an accelerator division was created, with the long-term goal of building a meson facility that would house a 600 MeV linear accelerator for protons and H⁻ ions. The first beam was accelerated to 20 MeV in 1988 and the facility was fully operational by 1993. Now known as the Moscow Meson Facility, it has the most powerful linear proton accelerator in the Euro-Asian region, supporting fundamental and applied research in nuclear and neutron physics and condensed matter, the development of technologies for the production of a wide range of radioisotopes, the operation of a radiation-therapy complex and many other applications.

A town called Neutrino
More than 1000 miles south of the Troitsk laboratory, an underground tunnel in the Caucasus mountains is the base of another INR RAS facility, the Baksan Neutrino Observatory (BNO). The facility was established in 1967 and the Baksan Underground Scintillation Telescope (BUST) started taking data in 1978. A town aptly called “Neytrino” (Russian for neutrino) was constructed in parallel with the facility, where scientists and their families could live 1700 m above sea level next to the observatory. In 1987 BUST was one of the four neutrino detectors that first directly observed neutrinos from supernova SN1987A.

The observatory did not stop there: the next step was the gallium–germanium neutrino telescope (GGNT), which was home to the Soviet–American Gallium Experiment (SAGE). The experiment contributed heavily towards solving the solar-neutrino problem and simultaneously gave rise to a new puzzle known as the gallium anomaly, which is yet to be explained. SAGE is still well and truly alive, and with an upgrade of the GGNT completed in 2019, the team will now hunt for sterile neutrinos.

Modules being installed in Baikal-GVD

By 1990 another neutrino detector was under construction, following the original proposal of Markov and Chudakov. In collaboration with JINR, plans took shape for an underwater neutrino telescope in the world’s largest freshwater lake, Lake Baikal. Underwater telescopes use glass spheres housing photomultiplier tubes to detect Cherenkov light from the charged particles emerging from neutrino interactions in the lake water. The first detector developed for Lake Baikal was NT200, which was constructed between 1993 and 1998 and detected cosmic neutrinos for more than a decade. It has now been replaced with the Gigaton Volume Detector (Baikal-GVD), and plans were concluded in 2015 for the first phase of the telescope to be completed by 2021. Baikal-GVD has an effective volume of 1 km³ and is designed to register and study ultrahigh-energy neutrino fluxes from astrophysical sources.

Left a mark
There is no doubt that INR RAS has left its mark on high-energy physics. While the Institute’s most recognised work is in neutrino physics, the Moscow Meson Facility has also contributed significantly to other areas of the field. An experiment was created for the direct measurement of the mass of the electron antineutrino via the beta decay of tritium. The “Troitsk nu-mass” experiment started in 1985 and its limit on the electron-antineutrino mass was the world’s best for years. Improving on this result became possible only in 2019 with the large-scale KATRIN experiment in Germany, which was built with the participation of INR RAS. In fact, the Troitsk nu-mass experiment served as a prototype for KATRIN.

Experimental data have been obtained on nuclear reactions involving protons and neutrons at medium energies, along with data on photonuclear reactions, including studies of the spin structure of the proton using an active polarised target. New effects in collisions of relativistic nuclei have been observed, and a new scientific direction, “nuclear photonics”, has been started. Two effects in astroparticle physics have been named after scientists from INR RAS: the “GZK cut-off”, the high-energy cut-off in the spectrum of ultrahigh-energy cosmic rays, named after Kenneth Greisen (US), Georgy Zatsepin and Vadim Kuzmin (INR RAS); and the “Mikheyev–Smirnov–Wolfenstein effect” concerning neutrino oscillations in matter, named after Stanislav Mikheyev, Alexei Smirnov (INR RAS) and Lincoln Wolfenstein (US).

Theoretical studies at INR RAS are also widely known. They include the development of perturbation-theory methods, studies of the ground state (vacuum) in gauge theories, methods for studying the dynamics of the strong interactions of hadrons outside the framework of perturbation theory, the first-ever brane-world models, and the development of principles and the search for mechanisms behind the formation of the baryon asymmetry of the universe.


Scientists from INR RAS take an active part in a number of large international experiments at CERN and JINR, and in Germany, Japan, Italy, the USA, China, France, Spain and other countries. The Institute also conducts educational activities, with its own graduate school and teaching departments in nearby institutes such as the Moscow Institute for Physics and Technology.

The future of INR RAS is deeply rooted in its new large-scale infrastructures. Baikal-GVD will, along with the IceCube experiment at the South Pole, be able to register neutrinos of astrophysical origin in the hope of establishing their nature. A project has been prepared to modernise the linear proton accelerator in Troitsk using superconducting radio-frequency cavities, while there are also plans to construct a large centre for nuclear medicine at the base of the linear accelerator centre. There is a proposal to build the Baksan Large-Volume Scintillator Detector at BNO containing 10 ktons of ultra-pure liquid scintillator, which would be able to register neutrinos from the carbon–nitrogen–oxygen (CNO) fusion cycle in the Sun with a precision sufficient to discriminate between various solar models.

The past 50 years have seen consistent growth at INR RAS, and with world-leading future projects on the horizon, the Institute has no signs of slowing down.

The post Russia’s particle-physics powerhouse appeared first on CERN Courier.

Feature The Institute for Nuclear Research in Moscow celebrates its 50th anniversary. https://cerncourier.com/wp-content/uploads/2020/12/2-монтаж.jpg
The Mirror Trap https://cerncourier.com/a/the-mirror-trap/ Tue, 15 Dec 2020 23:43:11 +0000 https://preview-courier.web.cern.ch/?p=90357 Simon Watt's performance gives the audience a chance to experiment with the psychology of self-identity and explore the interpretations of quantum mechanics, writes our reviewer.

The post The Mirror Trap appeared first on CERN Courier.

The Mirror Trap

A quantum physicist has mysteriously disappeared, leaving behind two mirrors, a strange machine, hallucinogenic drugs and a diary filled with ramblings and Feynman diagrams. His last thoughts reveal his views on the many-worlds interpretation – the controversial idea that there are as many worlds as there are possible outcomes in quantum measurements.

The Mirror Trap is an online performance in which the audience has the chance to experiment with the psychology of self-identity and explore the interpretations of quantum mechanics. Audience members are asked to draw Feynman diagrams on a mirror, plonk themselves down in front of it and listen to the play through headphones, thereby transforming a dimly lit room into a private theatrical space.

The experience is hypnotic, eerie and introspective. Ideas at the intersection between physics and psychology are described in a beautifully written monologue. The protagonist believes that he has devised a new way to access a parallel universe and replicate Schrödinger’s thought experiment; however, he must play the role of the cat, and be observed. Under severe emotional pressure, he begs the audience to witness his desperate attempt to reach a universe where he did not make the biggest mistake of his life.


While the physicist is digging deep into his psyche and preparing for a leap into the unknown, visual and auditory illusions play tricks with the participants’ brains. From Snow White to Alice Through the Looking-Glass, mirrors have been linked to mysterious portals, superstition and fairy tales. In this play, they are portals to other worlds, and also tools with which to reflect on life, self and perception. Many people feel subjective sensations of otherness and report dissociative identity effects when looking at themselves in a mirror. This strange-face-in-the-mirror illusion is more pronounced in dim light and is associated with Troxler’s fading and neural adaptation: when we look at an unchanging image, some features disappear temporarily from our perception and our brain fills in the missing information with other elements. This effect is particularly spooky when applied to one’s own face.

The performance was written, created and played by biologist and science communicator Simon Watt, with assistance from playwright Alexandra Wood. The 20-minute piece was followed by a discussion and question-and-answer session with Watt, psychologist Julia Shaw, and physicist Harry Cliff of LHCb and the University of Cambridge, who was scientific consultant for this work and guest physicist at the Bloomsbury Festival, under the auspices of which the piece was performed. Watt is now looking for other researchers and festivals interested in collaborating.

As arts and science festivals have moved online because of Covid-19 restrictions, this show found a creative way to engage the public while sitting at home. A well-thought-out merging of drama and science engagement, The Mirror Trap is an intense and intriguing experience for physicists and non-physicists alike.

The post The Mirror Trap appeared first on CERN Courier.

Review Simon Watt's performance gives the audience a chance to experiment with the psychology of self-identity and explore the interpretations of quantum mechanics, writes our reviewer. https://cerncourier.com/wp-content/uploads/2020/12/Mirror-Trap-1000.jpg
To Russia with love https://cerncourier.com/a/to-russia-with-love/ Tue, 15 Dec 2020 21:48:15 +0000 https://preview-courier.web.cern.ch/?p=90341 Frank Close's new book on nuclear spy Klaus Fuchs offers a poignant insight into a formative time for the field, writes our reviewer.

The post To Russia with love appeared first on CERN Courier.

“Why do you give all those secrets to the Russians?” So teases an inebriated Mary Bunemann, confidante to the leading nuclear physicists at the UK’s Atomic Energy Research Establishment, at the emotional climax of Frank Close’s new book Trinity: The Treachery and Pursuit of the Most Dangerous Spy in History. The scene is a party on New Year’s Eve in 1949, in the cloistered laboratory at Harwell, in the Berkshire countryside. With her voice audible across a room populated by his close colleagues and friends, Bunemann unwittingly confronted theoretical physicist Klaus Fuchs with the truth of his double life. As Close’s text suspensefully unfolds, the biggest brain working on Britain’s effort to build a nuclear arsenal had been faced with the very same allegation by an MI5 interrogator just 10 days earlier.


Klaus Fuchs began working on nuclear weapons in 1941, when he was recruited by Rudolf Peierls – the “midwife to the atomic age”, in Close’s estimation. Both men were refugees from Nazi Germany. A few years older, and better established in Britain, Peierls would become a friend and mentor to Fuchs. A quarter of a century later, Peierls would also establish a relationship with a young Frank Close, when he arrived at Oxford’s theoretical physics department. Close has now been able to make a poignant contribution to the literature of the bomb by sharing the witness of his connection to the Peierls family, who felt Fuchs’ betrayal bitterly, and were personally affected by the suspicion engendered by his espionage.

Close’s story expands dramatically in scope when Peierls and Fuchs are recruited to the Manhattan Project. Though Peierls was among the first to glimpse the power of atomic weapons, Fuchs began to exceed him in significance to the project during this period. In one of the strongest portions of the book, Close balances physics, politics and the intrigue of shady meetings with Fuchs’ handlers at a time when he passed to the Soviet Union a complete set of instructions for building the first stage of a uranium bomb, a full description of the plutonium bomb used in the Trinity test in the New Mexico desert, and detailed notes on Enrico Fermi’s lectures on the hydrogen bomb.

Intensely claustrophobic

The story becomes intensely claustrophobic when Fuchs returns to England to head the theoretical physics department at Harwell. Here, Close evokes the contradictions in Fuchs’ character: his conviction that nuclear knowledge should be shared between great powers to avert war; his principled but tested faith in communism, awakened while protesting the rise of Nazism; his devoted pastoral care for members of his inner circle at Harwell, even as the net closed around him; and his willingness to share not only nuclear secrets but also the bed of his colleague’s wife. Close has a particular obsession with the question of whether Fuchs’ eventual confession was induced by unrealistic suggestions that he could be forgiven and continue his work. But inducement did not jeopardise Fuchs’ ultimate conviction and imprisonment, despite MI5’s fears, and Close judges his 14-year sentence, later reduced, to be just. Even here, however, the Soviets had the last laugh, with Fuchs’ apprehension not only depriving the British nuclear programme of its greatest intellectual asset, but also precipitating the defection of Bruno Pontecorvo.

Trinity book cover

Close chose an ideal moment to research his history, writing with the benefit of newly released MI5 records, and before several others were withdrawn without notice. He applies forensic attention to the agency’s pursuit of the nuclear spy. Occasionally, however, this is to the detriment of the reader, with events seemingly diffracted onto the pages – both prefigured and returned to as the story progresses and new evidence comes to light. We step through time in Fuchs’ shoes, for example only learning at the end of the book that two other spies at the Manhattan Project were also passing information to the Russians. While Close’s inclination to let the evidence speak for itself is surely the mark of a good physicist, readers in search of a more analytical history may wish to also consult Mike Rossiter’s 2014 biography The Spy Who Changed the World: Klaus Fuchs and the secrets of the nuclear bomb, which offers a more rounded presentation of the Russian and American perspectives.

By bringing physics expertise, personal connections and impressive attention to detail to bear, Frank Close’s latest book has much to offer readers seeking insights into a formative time for the field, when the most talented minds in nuclear physics also bore the weight of world politics on their shoulders. He eloquently tells the tragedy of “the most dangerous spy in history”, as it played out between the trinity of Fuchs, his mentor Peierls and a shadowy network of spooks. Above all, the text is an intimate portrait of the inner struggles of a principled man who betrayed his adopted homeland, even as he grew to love it, and by doing so helped to shape the latter half of the 20th century.

The post To Russia with love appeared first on CERN Courier.

Review Frank Close's new book on nuclear spy Klaus Fuchs offers a poignant insight into a formative time for the field, writes our reviewer. https://cerncourier.com/wp-content/uploads/2020/12/Fuchs-1000.jpg
A unique period for computing, but will it last? https://cerncourier.com/a/a-unique-period-for-computing-but-will-it-last/ Wed, 18 Nov 2020 09:35:20 +0000 https://preview-courier.web.cern.ch/?p=89989 The computing demands expected this decade puts HEP in a similar position to 1995 when the field moved to PCs, argues Sverre Jarp.

The post A unique period for computing, but will it last? appeared first on CERN Courier.

Monica Marinucci and Ivan Deloose

Twenty-five years ago in Rio de Janeiro, at the 8th International Conference on Computing in High-Energy and Nuclear Physics (CHEP-95), I presented a paper on behalf of my research team titled “The PC as Physics Computer for LHC”. We highlighted impressive improvements in price and performance compared to other solutions on offer. In the years that followed, the community started moving to PCs in a massive way, and today the PC remains unchallenged as the workhorse for high-energy physics (HEP) computing.

HEP-computing demands have always been greater than the available capacity. However, our community does not have the financial clout to dictate the way computing should evolve, so constant innovation and research in computing and IT are needed to maintain progress. A few years before CHEP-95, RISC workstations and servers had started complementing the mainframes that had been acquired at high cost at the start-up of LEP in 1989. We thought we could do even better than RISC. The increased-energy LEP2 phase needed lots of simulation, and the same needs were already manifest for the LHC. This was the inspiration that led PC servers to start populating our computer centres – a move that was also helped by a fair amount of luck.

Fast change

HEP programs need good floating-point compute capability, and early generations of the Intel x86 processors, such as the 486/487 chips, offered only mediocre performance in this respect. The Pentium processors that emerged in the mid-1990s changed the scene significantly, and the competitive race between Intel and AMD was a major driver of continued hardware innovation.

Another strong tailwind came from the relentless efforts to shrink transistor sizes in line with Moore’s law, which saw processor speeds increase from 50/100 MHz to 2000/3000 MHz in little more than a decade. After 2006, when speed increases became impossible for thermal reasons, efforts moved to producing multi-core chips. However, HEP continued to profit. Since all physics events at colliders such as the LHC are independent of all others, it was sufficient to split a job into multiple jobs across all cores.
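Because collider events are statistically independent, the multi-core transition could be absorbed simply by running one identical job, or chunk of events, per core. A schematic sketch (illustrative only; process_event is a hypothetical stand-in, not real experiment code):

```python
from multiprocessing import Pool

def process_event(event_id: int) -> float:
    """Stand-in for the reconstruction or simulation of one independent event."""
    return float(event_id) ** 0.5   # dummy result

if __name__ == "__main__":
    event_ids = range(100_000)
    with Pool() as pool:                 # one worker process per available core
        results = pool.map(process_event, event_ids, chunksize=1_000)
    print(f"processed {len(results)} independent events")
```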

Sverre Jarp

The HEP community was also lucky with software. Back in 1995 we had chosen Windows/NT as the operating system, mainly because it supported multiprocessing, which significantly enhanced our price/performance. Physicists, however, insisted on Unix. In 1991, Linus Torvalds released Linux version 0.01, and it quickly gathered momentum as a worldwide open-source project. When release 2.0 appeared in 1996, multiprocessing support was included and the operating system was quickly adopted by our community.

Furthermore, HEP adopted the Grid concept to cope with the demands of the LHC. Thanks to projects such as Enabling Grids for E-sciencE, we built the Worldwide LHC Computing Grid, which today handles more than two million tasks across one million PC cores every 24 hours. Although grid computing has remained mainly a tool for scientific users, the analogous concept of cloud computing had the same cementing effect across industry. Today, all the major cloud-computing providers overwhelmingly rely on PC servers.

In 1995 we had seen a glimmer, but we had no idea that the PC would remain an uncontested winner during a quarter of a century of scientific computing. The question is whether it will last for another quarter century?

The contenders

The end of CPU scaling, argued a recent report by the HEP Software Foundation, demands radical changes in computing and software to ensure the success of the LHC and other experiments into the 2020s and beyond. There are many contenders that would like to replace the x86 PC architecture. It could be graphics processors, where Intel, AMD and Nvidia are all active. A wilder guess is quantum computing, whereas a more conservative one would be processors similar to the x86 but based on other architectures, such as ARM or RISC-V.


During the PC project we collaborated with Hewlett-Packard, which had a division in Grenoble, not too far away. Such R&D collaborations have been vital to CERN and the community since the beginning, and they remain so today. They allow us to gain insight into forthcoming products and future plans, while our feedback can help to influence products that are still at the planning stage. CERN openlab, which has been the focal point for such collaborations for two decades, coined early on the phrase “You make it, we break it”. Whatever the future holds, it is fair to assume that PCs will remain the workhorse of HEP computing for many years to come.

The post A unique period for computing, but will it last? appeared first on CERN Courier.

Opinion The computing demands expected this decade puts HEP in a similar position to 1995 when the field moved to PCs, argues Sverre Jarp. https://cerncourier.com/wp-content/uploads/2020/11/CCNovDec20_VIEW_stack.jpg
How to find a Higgs boson https://cerncourier.com/a/how-to-find-a-higgs-boson/ Thu, 12 Nov 2020 10:50:37 +0000 https://preview-courier.web.cern.ch/?p=89971 Ivo van Vulpen’s popular book isn’t an airy pamphlet cashing in on the 2012 discovery, but a realistic representation of what it’s like to be a particle physicist.

The post How to find a Higgs boson appeared first on CERN Courier.

How to Find a Higgs Boson

Finding Higgs bosons can seem esoteric to the uninitiated. The spouse of a colleague of mine has such trouble describing what their partner does that they read from a card in the event that they are questioned on the subject. Do you experience similar difficulties in describing what you do to loved ones? If so, then Ivo van Vulpen’s book How to find a Higgs boson may provide you with an ideal gift opportunity.

Readers will feel like they are talking physics over a drink with van Vulpen, who is a lecturer at the University of Amsterdam and a member of the ATLAS collaboration. Originally published as De melodie van de natuur, the book’s Dutch origins are unmistakable. We read about Hans Lippershey’s lenses, Antonie van Leeuwenhoeck’s microbiology, Antonius van den Broek’s association of charge with the number of electrons in an atom, and even Erik Verlinde’s theory of gravity as an emergent entropic force. Though the Higgs is dangled at the end of chapters as a carrot to get the reader to keep reading, van Vulpen’s text isn’t an airy pamphlet cashing in on the 2012 discovery, but a realistic representation of what it’s like to be a particle physicist. When he counsels budding scientists to equip themselves better than the North Pole explorer who sets out with a Hugo Boss suit, a cheese slicer and a bicycle, he tells us as much about himself as about what it’s like to be a physicist.

Van Vulpen is a truth teller who isn’t afraid to dent the romantic image of serene progress orchestrated by a parade of geniuses. In the English translation by David McKay, he writes that 9999 out of every 10,000 predictions from “formula whisperers” (theorists) turn out to be complete hogwash. Sociological realities such as “mixed CMS–ATLAS” couples temper the physics, which is unabashedly challenging and unvarnished. The book boasts a particularly lucid and intelligible description of particle detectors for the general reader, and has a nice focus on applications. Particle accelerators are discussed in relation to the “colour X-rays” of the Medipix project. Spin in the context of MRI. Radioactivity with reference to locating blocked arteries. Antimatter in the context of PET scans. Key ideas are brought to life in cartoons by Serena Oggero, formerly of the LHCb collaboration.


Attentive readers will occasionally be frustrated. For example, despite a stated aim of the book being to fight “formulaphobia”, Bohr’s famous recipe for energy levels lacks the crucial minus sign just a few lines before a listing of –3.6 eV (as opposed to –13.6 eV) for the energy of the ground state. Van Vulpen compares the beauty seen by physicists in equations to the beauty glimpsed by musicians as they read sheet music, but then prints Einstein’s field equations with half the tensor indices missing. But to quibble about typos in the English translation would be to miss the point of the book, which is to allow readers “to impress friends over a drink,” and talk physics “next time you’re in a bar”. Van Vulpen’s writing is always entertaining, but never condescending. Filled with amusing but perceptive one-liners, the book is perfectly calibrated for readers who don’t usually enjoy science. Life in a civilisation that evolved before supernovas would have no cutlery, he observes. Neutrinos are the David Bowie of particles. The weak interaction is like a dog on an attometre-long chain.

This book could be the perfect gift for a curious spouse. But beware: fielding questions on the excellent last chapter, which takes in supersymmetry, SO(10), and millimetre-scale extra dimensions, may require some revision.

The post How to find a Higgs boson appeared first on CERN Courier.

Review Ivo van Vulpen’s popular book isn’t an airy pamphlet cashing in on the 2012 discovery, but a realistic representation of what it’s like to be a particle physicist. https://cerncourier.com/wp-content/uploads/2020/11/CCNovDec20_REV_Ivo_feature.jpg
Particles mean prizes https://cerncourier.com/a/particles-mean-prizes/ Tue, 15 Sep 2020 15:01:40 +0000 https://preview-courier.web.cern.ch/?p=88372 Just five research areas account for more than half of Nobel prizes.

The post Particles mean prizes appeared first on CERN Courier.

Just five research areas account for more than half of Nobel prizes, even though they publish only 10% of papers, reveals a study by social scientists John Ioannidis, Ioana-Alina Cristea and Kevin Boyack. The trio mapped the Nobel prizes in medicine, physics and chemistry awarded between 1995 and 2017 onto 114 fields of science, finding that particle physics came top with 14%, followed by cell biology (12%), atomic physics (11%), neuroscience (10%) and molecular chemistry (5%). The analysts also investigated whether Nobel success reflects immediate scientific impact, and found that the only prize whose key paper was also the most cited paper of its year was the 2010 award to Andre Geim and Konstantin Novoselov for experiments with graphene. On average, more than 400 papers published within a year either side of the key work had greater impact than the work most closely associated with the prize-winners’ success.

Particle-physics prize-winners in the period studied include: Perl and Reines (1995) for the discovery of the tau lepton and the detection of the neutrino; ’t Hooft and Veltman (1999) for contributions to electroweak theory; Davis and Koshiba (2002) for the detection of cosmic neutrinos; Gross, Politzer and Wilczek (2004) for asymptotic freedom; Nambu, Kobayashi and Maskawa (2008) for work on spontaneous symmetry breaking and quark mixing; Englert and Higgs (2013) for the Brout–Englert–Higgs mechanism; and Kajita and McDonald (2015) for the discovery of neutrino oscillations. The team also chose to class Mather and Smoot’s 2006 prize relating to the cosmic microwave background, Perlmutter, Schmidt and Riess’s 2011 award for the discovery of the accelerating expansion of the universe, and Weiss, Barish and Thorne’s 2017 gong for the observation of gravitational waves as particle-physics research.

The winners of this year’s Nobel prize in physics will be announced on Tuesday 6 October.

The post Particles mean prizes appeared first on CERN Courier.

News Just five research areas account for more than half of Nobel prizes. https://cerncourier.com/wp-content/uploads/2020/09/journal.pone_.0234612.g001-1000.jpg
New Perspectives on Einstein’s E = mc² https://cerncourier.com/a/new-perspectives-on-einsteins-e%e2%80%89%e2%80%89mc2/ Tue, 07 Jul 2020 09:12:51 +0000 https://preview-courier.web.cern.ch/?p=87753 Young Suh Kim and Marilyn Noz’s book may struggle to find its audience, says Nikolaos Rompotis.

The post New Perspectives on Einstein’s E = mc² appeared first on CERN Courier.


New Perspectives on Einstein’s E = mc² mixes historical notes with theoretical aspects of the Lorentz group that impact relativity and quantum mechanics. The title is a little perplexing, however, as one can hardly expect nowadays to discover new perspectives on an equation such as E = mc². The book’s true aim is to convey to a broader audience the formal work done by the authors on group theory. Therefore, a better-suited title may have been “Group theoretical perspectives on relativity”, or even, more poetically, “When Wigner met Einstein”.

The first third of the book is an essay on Einstein’s life, with historical notes on topics discussed in the subsequent chapters, which are more mathematical and draw heavily on publications by the authors – a well-established writing team who have co-authored many papers relating to group theory. The initial part is easy to read and includes entertaining stories, such as Einstein’s mistakes when filing his US tax declaration. Einstein, according to this story, calculated his taxes erroneously, but the US tax authorities were kind enough not to raise the issue. The reader has to be warned, however, that the authors, professors at the University of Maryland and New York University, have a tendency to make questionable statements about certain aspects of the development of physics that may not be backed up by the relevant literature, and may even contradict known facts. They repeatedly interpret the development of physical theories in terms of a Hegelian synthesis of a thesis and an antithesis, without any cited sources in support, which seems, in most cases, to be a somewhat arbitrary a posteriori assessment.

There is a sharp distinction in the style of the second part of the book, which requires training in physics or maths at advanced undergraduate level. These chapters begin with a discussion of the Lorentz group. The interest then quickly shifts to Wigner’s “little groups”, which are subgroups of the Lorentz group with the property of leaving the momentum of a system invariant. Armed with this mathematical machinery, the authors proceed to Dirac spinors and give a Lorentz-invariant formulation of the harmonic oscillator that is eventually applied to the parton model. The last chapter is devoted to a short discussion on optical applications of the concepts advanced previously. Unfortunately, the book finishes abruptly at this point, without a much-needed final chapter to summarise the material and discuss future work, which, the previous chapters imply, should be plentiful.
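
For readers who want a concrete handle on the central object of these chapters, here is the standard textbook definition of Wigner’s little group (an illustration added for orientation, not an excerpt from the book): it is the set of Lorentz transformations that leave a chosen four-momentum p unchanged, and its structure depends on whether the particle is massive or massless,

\[
W(p) = \{\Lambda \in SO(3,1)\,:\,\Lambda p = p\},
\qquad
W(p) \cong
\begin{cases}
SO(3), & p^2 = m^2 > 0 \quad \text{(rotations in the rest frame)},\\[2pt]
ISO(2), & p^2 = 0 \quad \text{(Euclidean group of the plane)}.
\end{cases}
\]

The spin states of a massive particle and the helicity states of a massless one transform under representations of these respective little groups.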

Young Suh Kim and Marilyn Noz’s book may struggle to find its audience. The contrast between the lay and expert parts of this short book, and the very specialised topics it explores, do not make it suitable for a university course, though sections could be incorporated as additional material. It may well serve, however, as an interesting pastime for mathematically inclined audiences who will certainly appreciate the formalism and clarity of the presentation of the mathematics.

The post New Perspectives on Einstein’s E = mc² appeared first on CERN Courier.

]]>
Review Young Suh Kim and Marilyn Noz’s book may struggle to find its audience, says Nikolaos Rompotis. https://cerncourier.com/wp-content/uploads/2020/07/CCJulAug20_Rev_einstein_feature.jpg
The Human Condition: Reality, Science and History https://cerncourier.com/a/the-human-condition-reality-science-and-history/ Wed, 01 Apr 2020 08:08:56 +0000 https://preview-courier.web.cern.ch/?p=86689 Renowned accelerator physicist Gregory Loew has written an insightful book of truly ambitious scope, writes our reviewer.

The post The Human Condition: Reality, Science and History appeared first on CERN Courier.

]]>
The Human Condition: Reality, Science and History

“Homo has much work left to become Sapiens,” is Gregory Loew’s catchphrase in The Human Condition: Reality, Science and History. An accelerator physicist with an illustrious 50-year-long career at the SLAC National Accelerator Laboratory in California, Loew also taught a seminar at Stanford University that ran the gamut from psychology and anthropology to international relations and arms control. His new book combines these passions.

This reviewer must admit to being inspired by the breadth of Loew’s polymathic ambition, which he has condensed into 200 colourful pages. The author compares his work to noted Israeli historian Yuval Harari’s hefty tomes Sapiens and Homo Deus, but The Human Condition is more idiosyncratic, and peppered with fascinating titbits. He points out the difficulties in connecting free will with quantum indeterminacy. He asks what came first: the electron or the electric field? Neglecting to mention the disagreement with the long-accepted age of the universe inferred from fits to the cosmic microwave background, he breathlessly slips in a revised-down value of 12.8 billion years, tacitly accepting the 2019 measurement of the Hubble constant based on observations by the Hubble Space Telescope. He even digresses momentarily to note the almost unique rhythmic awareness of cockatoos.

But this is not a scenic drive through the nature of existence. Loew wants to be complete. He reverses from epistemology to evolution and the nature of perception, before pulling out onto the open road of mathematics and the sciences, both fundamental and social, via epigenetics, Thucydides and the Cuban missile crisis. The final chapter, which looks to the future, is really a thoughtful critique of Harari’s books, which he discovered while writing. It’s heartening to join Loew on an expansive road trip from metaphysics and physics to economic theory and realpolitik.

No scientific knowledge or mathematical training is necessary to enjoy The Human Condition, which will entertain and intrigue physicists and lay audiences alike. While some subjects, such as homosexuality, are treated with inappropriate swiftness, in that case with a rapid and highly questionable hop from Freud to Kinsey to Schopenhauer to Pope Francis, in general Loew writes with a refreshing élan. His final thought is that “if all Homo Sapientes became wiser, they would certainly be happier.” Here, he flirts with contradicting Kant, a philosopher he frequently esteems, who wrote that the cultivation of reason sooner leads to misery than happiness. But perhaps the key word is “all Homo Sapientes”. If every one of us became wiser, perhaps through the utopic initiatives advocated by Loew, we would indeed be happier.

The post The Human Condition: Reality, Science and History appeared first on CERN Courier.

]]>
Review Renowned accelerator physicist Gregory Loew has written an insightful book of truly ambitious scope, writes our reviewer. https://cerncourier.com/wp-content/uploads/2020/03/CCMarApr20_Reviews_loew_feature.jpg
Fiction, in theory https://cerncourier.com/a/fiction-in-theory/ Tue, 31 Mar 2020 18:22:51 +0000 https://preview-courier.web.cern.ch/?p=87038 French actor Irène Jacob's novel is an intimate portrait of life as the daughter of a renowned theoretical physicist, writes James Gillies.

The post Fiction, in theory appeared first on CERN Courier.

]]>

French actor Irène Jacob rose to international acclaim for her role in the 1991 film The Double Life of Véronique. She is the daughter of Maurice Jacob (1933 – 2007), a French theoretical physicist and Head of CERN’s Theory Division from 1982 to 1988. Her new novel, Big Bang, is a fictionalised account of the daughter of a renowned physicist coming to terms with the death of her father and the arrival of her second child. Keen to demonstrate the artistic beauty of science, she is also a Patron of the Physics of the Universe Endowment Fund established in Paris by George Smoot.

When Irène Jacob recites from her book, it is more than a reading, it’s a performance. That much is not surprising: she is after all the much-feted actor in the subtly reflective 1990s films of Krzysztof Kieślowski. What did come as a surprise to this reader is just how beautifully she writes. With an easy grace and fluidity, she weaves together threads of her life, of life in general, and of the vast mysteries of the universe.

The backdrop to the opening scenes is the corridors of the theory division in the 70s and 80s

Billed as a novel, Big Bang comes across more as a memoir, and that’s no accident. The author’s aim was to use her entourage, somewhat disguised, to tell a universal story of the human condition. Names are changed: Irène’s father, the physicist Maurice Jacob, for example, becomes René, his second name. The true chronology of events is not strictly observed, and maybe there’s some invention, but behind the storytelling there is nevertheless a touching portrait of a very real family. The backdrop to the opening scenes is CERN, more specifically the corridors of the theory division in the 70s and 80s, a regular stomping ground for the young Irène. The reader discovers the wonders of physics through the wide-open eyes of a seven-year-old child. Later on, that child-become-adult reflects on other wonders – those related to the circle of life. The book ties all this together, seen from the point in spacetime at which Irène has to reconcile her father’s passing with her own impending motherhood.

For those who remember the CERN of the 80s, the story begins with an opportunity to rediscover old friends and places. For those not familiar with particle physics, it offers a glimpse into the field, to those who devote their lives to it, and to those who share their lives with them. The initial chapters open the door to Irène Jacob’s world, just a crack.

The atmosphere soon changes, though, as she flings the door wide open. More than once I found myself wondering whether I had the right to be there: inside Irène Jacob’s life, dreams and nightmares. It is a remarkably intimate account, looking deep into what it is to be human. Highs and lows, loves and laughs, kindnesses and hurts, even tragedies: all play a part. Irène Jacob’s fictionalised family suffers much, yet although Irène holds nothing back, Big Bang is essentially an optimistic, life-affirming tale.

Science makes repeated cameo appearances. There’s a passage in which René is driving home from hospital after welcoming his first child into the world. Distracted by emotion, he’s struck by a great insight and has to pull over and tell someone. How often does that happen in the creative process? Kary Mullis tells a similar story in his memoirs. In his case, the idea for the polymerase chain reaction came to him at the end of a hot May day on Highway 128, with his girlfriend asleep next to him in the passenger seat of his little silver Honda. Mullis got the Nobel Prize. Both insights had a profound impact on their fields.

Bohr can be paraphrased as saying: the opposite of a profound truth is another profound truth

Alice in Wonderland is a charmingly recurrent theme, particularly the Cheshire cat. Very often, a passage ends with nothing left but an enigmatic smile, a metaphor for life in the quantum world, where believing in six impossible things before breakfast is almost a prerequisite.

Big Bang is not a page turner. Instead, each chapter is a beautifully formed vignette of family life. Take, for example, the passage that begins with a quote from Niels Bohr taken from René’s manuscript, Des Quarks et des Hommes (published as Au Coeur de la Matière). Bohr can be paraphrased as saying: the opposite of a profound truth is another profound truth. As the passage moves on, it plays with this theme, ending with the conclusion: if my story does not stand up, it’s because reality is very small. And if my story is very small, it is because reality does not stand up.

Whatever the author’s wish, Big Bang comes across as an admirably honest family portrait, at times uncomfortably so. It’s a portrait that goes much deeper than the silver screen or the hallowed halls of academia. The cast of Big Bang is a very human family, and one that this reader came to like very much.

The post Fiction, in theory appeared first on CERN Courier.

]]>
Review French actor Irène Jacob's novel is an intimate portrait of life as the daughter of a renowned theoretical physicist, writes James Gillies. https://cerncourier.com/wp-content/uploads/2020/03/Irene-Jacob-CERN.jpg
Ascent commemorates cosmic-ray pioneers https://cerncourier.com/a/ascent-commemorates-cosmic-ray-pioneers/ Thu, 26 Mar 2020 08:09:00 +0000 https://preview-courier.web.cern.ch/?p=86667 Particle physicists brought cosmic-ray science to the heart of the Château-d’Oex International Balloon Festival.

The post Ascent commemorates cosmic-ray pioneers appeared first on CERN Courier.

]]>
A hot-air balloon commemorating the discovery of cosmic rays

On 25 January, a muon detector, a particle physicist and a prizewinning pilot ascended 4000 m above the Swiss countryside in a hot-air balloon to commemorate the discovery of cosmic rays. The event was the highlight of the opening ceremony of the 42nd Château-d’Oex International Balloon Festival, attended by an estimated 30,000 people, and attracted significant media coverage.

In the early 1900s, following Becquerel’s discovery of radioactivity, studying radiation was all the rage. Portable electrometers were used to measure the ionisation of air in a variety of terrestrial environments, from fields and lakes to caves and mountains. With the idea that ionisation should decrease with altitude, pioneers took to balloon flights as early as 1909 to count the number of ions per cm³ of air as a function of altitude. First results indeed indicated a decrease up to 1300 m, but a subsequent ascent to 4500 m by Albert Gockel, professor of physics at Fribourg, concluded that ionisation does not decrease and possibly increases with altitude. Gockel, however, who later would coin the term “cosmic radiation”, was unable to obtain the hydrogen needed to go to higher altitudes. And so it fell to Austrian physicist Victor Hess to settle the case. Ascending to 5300 m in 1912, Hess clearly identified an increase, and went on to share the 1936 Nobel Prize in Physics for the discovery of cosmic rays. Gockel, who died in 1927, could not share in the award, and for that reason is almost forgotten by history.

ATLAS experimentalist Hans Peter Beck of the University of Bern, and a visiting professor at the University of Fribourg, along with two students from the University of Fribourg, reenacted Gockel’s and Hess’s pioneering flights using 21st-century technology: a muon telescope called the Cosmic Hunter, newly developed by instrumentation firm CAEN. The educational device, which counts coincidences in two scintillating-fibre tiles of 15 × 15 cm² separated by 15 cm, verified that the flux of cosmic rays increases as a function of altitude. Within two hours of landing, including a one-hour drive back to the starting point, Beck was able to present the data plots during a public talk attended by more than 250 people. A second flight up to 6000 m is planned, with oxygen supplies for passengers, when weather conditions permit.
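
To give a feel for the numbers involved in such a measurement, the sketch below simulates coincidence counts versus altitude for a small two-tile telescope. It is a toy model with assumed parameters (a nominal sea-level muon rate, a guessed geometric acceptance and a simple exponential growth of the flux with altitude) and is not the Cosmic Hunter’s actual acquisition or calibration:

import numpy as np

# Toy model: muon coincidence counts versus altitude for a small two-tile telescope.
# All numbers below are illustrative assumptions, not measured Cosmic Hunter parameters.
rng = np.random.default_rng(42)

area_cm2 = 15 * 15          # active area of one scintillating tile (cm^2)
rate_sea_level = 1.0 / 60   # assumed vertical muon rate: ~1 muon per cm^2 per minute (Hz per cm^2)
geom_acceptance = 0.5       # assumed fraction of muons that cross both tiles (depends on geometry)
scale_height_m = 4000.0     # assumed e-folding altitude for the flux increase in this range
exposure_s = 300            # counting time per altitude step (5 minutes)

altitudes_m = np.array([500, 1500, 2500, 3500, 4000])
expected = (rate_sea_level * area_cm2 * geom_acceptance * exposure_s
            * np.exp(altitudes_m / scale_height_m))
observed = rng.poisson(expected)  # Poisson-fluctuated counts

for h, n in zip(altitudes_m, observed):
    print(f"{h:5d} m : {n:6d} coincidences in {exposure_s} s")

Even with these rough assumptions, the Poisson-fluctuated counts rise visibly over a few kilometres of altitude, which is the qualitative behaviour the balloon flight demonstrated.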

The view from inside the hot-air balloon

“Relating balloons with particle physics was an easy task, given the role balloons played in the early days for the discovery of cosmic rays,” says Beck. “It is a narrative that works and that touches people enormously, as the many reactions at the festival have shown.”

The event – a collaboration with the universities of Bern and Fribourg, the Swiss Physical Society, and the Jungfraujoch research station – ran in parallel to a special exhibition about cosmic rays at the local balloon museum, organised by Beck and Michael Hoch from CMS, which was the inspiration for festival organisers to make physics a focus of the event, says Beck: “Without this, the festival would never have had the idea to bring ‘adventure, science and freedom’ as this year’s theme. It’s really exceptional.”

The post Ascent commemorates cosmic-ray pioneers appeared first on CERN Courier.

]]>
News Particle physicists brought cosmic-ray science to the heart of the Château-d’Oex International Balloon Festival. https://cerncourier.com/wp-content/uploads/2020/03/CCMarApr20_NewsAnalysis_Balloon_feature.jpg
Rolf Widerøe: a giant in the history of accelerators https://cerncourier.com/a/rolf-wideroe-a-giant-in-the-history-of-accelerators/ Mon, 23 Mar 2020 19:17:18 +0000 https://preview-courier.web.cern.ch/?p=86886 Aashild Sørheim's book presents new documentary evidence on the wartime life of an engineer who had a seminal impact on accelerator physics, writes Kurt Hübner.

The post Rolf Widerøe: a giant in the history of accelerators appeared first on CERN Courier.

]]>
The betatron is an early type of MeV-range electron accelerator which uses the electric field induced by a varying magnetic field to accelerate electrons, or beta particles. It operates like a transformer with the secondary winding replaced by a beam of electrons circulating in a vacuum tube. It was invented by pioneering Norwegian accelerator physicist Rolf Widerøe when a student in 1925. Since the construction failed at the time, he had to find another theme for his thesis, and so in 1927 he constructed the first linear accelerator (50 keV), before later proposing the principle of colliding beams to fully exploit the energy of accelerated particles. Through these innovations, Rolf Widerøe decisively influenced the course of high-energy physics, with betatrons shaping the landscape in the early days, and linear accelerators and colliding beams becoming indispensable tools today.
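
The transformer analogy can be made quantitative with the standard betatron condition from textbook accelerator physics (stated here for orientation, not quoted from the book): for electrons to stay on an orbit of fixed radius R while the field ramps, the guide field at the orbit must be half the average field enclosed by the orbit,

\[
B_{\text{orbit}}(t) = \tfrac{1}{2}\,\bar B(t),
\qquad
\bar B(t) = \frac{1}{\pi R^2}\int_{r<R} B\,\mathrm{d}A .
\]

The changing flux then supplies the accelerating electric field \(E = -\tfrac{R}{2}\,\mathrm{d}\bar B/\mathrm{d}t\), while \(p = eRB_{\text{orbit}}\) keeps the orbit radius constant.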

Obsessed by a Dream: The Physicist Rolf Widerøe – A Giant in the History of Accelerators, by Aashild Sørheim

Aashild Sørheim, a professional writer, now presents a new biography of this visionary engineer, who had a seminal impact on accelerator physics. Her book covers Widerøe’s whole life, from 1902 to 1996, and from his childhood in a well-to-do family in Oslo to his retirement in Switzerland. Certainly, many who read Pedro Waloscheck’s 1994 biography, The Infancy of Particle Accelerators: Life and Work of Rolf Widerøe, will be curious how this new book will complement the former. Sørheim‘s new offering is based on new documentary evidence, the result of painstaking sifting through archives, and a large number of interviews. She has opened new perspectives through her interviews, and the access she has gained in several countries to hitherto restricted archives has provided a wealth of new material and insights, in particular in relation to the second world war. Sørheim’s book focuses not on physics or technology, but on Widerøe himself, and the social and political environment in which he had to find his way. In particular, it gravitates to the question of his motivation to work in Germany in the troubled years from 1943 to 1945, when he constructed a betatron, the accelerator he had invented two decades earlier while a student in Karlsruhe.

Occupied Oslo

In the most interesting parts, the book provides background information about the entanglement of science, industrial interests and armament, and in particular the possible reasons for the “recruitment” of Rolf Widerøe in occupied Oslo in the spring of 1943 by three German physicists mandated by the German air force, who insinuated that willingness to cooperate might well help to improve the conditions of his brother Viggo, who was in prison in Germany for helping Norwegians escape to England. The apparent motivation was that a powerful betatron could produce strong enough X-rays to neutralise Allied bomber pilots. Though leading German scientists quickly discovered this to be nonsense, the betatron project was not interrupted. The book describes the difficult working conditions in Hamburg, and the progress towards a 15 MeV betatron. Among the key players was Widerøe’s assistant Bruno Touschek, who was finally arrested by the Gestapo in 1945 as his mother was Jewish. It was during this time that Widerøe patented his idea to use colliding beams to maximise the energy available, against the advice of Touschek, who found the idea too trivial to publish. It was Touschek, though, who in 1961 first used this principle in ADA, the e⁺e⁻ ring at Frascati, which was the world’s first collider.

Widerøe faced official prosecution on the ludicrous charge of having helped develop V2 rockets

After Widerøe’s return to Oslo in March 1945, when the betatron was operational and the advancing English army made a study of a 200 MeV betatron illusory, he faced official prosecution on the ludicrous main charge of having helped develop V2 rockets, explains Sørheim. Released from prison after 47 days, he got away without trial, but had to pay a substantial fine. Unemployed, seeing no basis for pursuing his dream of further developing betatrons in his home country, and with the stigma of a collaborator in the understandably overheated atmosphere of the time, he moved his family to Switzerland in 1946. One chapter, strangely put near the beginning of the book, describes how Widerøe then became a successful leader of betatron production at Brown-Boveri in Switzerland, a respected lecturer at the ETH in Zurich and a promoter of radiation therapy until late into his retirement. He was a CERN consultant in the early days, and worked with Odd Dahl and Frank Goward at Brookhaven in 1952, where they became acquainted with the alternating-gradient focusing principle, which was then boldly proposed to the CERN Council as the basis for the design of the 25 GeV Proton Synchrotron.

The book leaves the reader somewhat overwhelmed by the amount of material presented, the non-chronological presentation and the many repetitions of the same facts, conveying the impression that the author had difficulty in putting the information in a coherent order. However, the many interviews and new documentary evidence, including a hitherto unknown letter from his brother Viggo, open novel perspectives on this extraordinary engineer and scientist who, besides receiving many honours abroad, finally also received recognition in his home country, after a lengthy reconciliation process.

The post Rolf Widerøe: a giant in the history of accelerators appeared first on CERN Courier.

]]>
Review Aashild Sørheim's book presents new documentary evidence on the wartime life of an engineer who had a seminal impact on accelerator physics, writes Kurt Hübner. https://cerncourier.com/wp-content/uploads/2020/03/Wideroe2.jpg
Einstein and Heisenberg: The Controversy over Quantum Physics https://cerncourier.com/a/einstein-and-heisenberg-the-controversy-over-quantum-physics/ Sat, 21 Mar 2020 11:26:28 +0000 https://preview-courier.web.cern.ch/?p=86683 Peter Jenni reviews Konrad Kleinknecht's new book on the interwoven stories of two giants of twentieth century physics.

The post Einstein and Heisenberg: The Controversy over Quantum Physics appeared first on CERN Courier.

]]>
Einstein and Heisenberg: The Controversy over Quantum Physics

This attractive and exciting book gives easy access to the history of the two main pillars of modern physics of the first half of the 20th century: the theory of relativity and quantum mechanics. The history unfolds along the parallel biographies of the two giants in these fields, Albert Einstein and Werner Heisenberg. It is a fascinating read for everybody interested in the science and culture of their time.

At first sight, one could think that the author presents a twin biography of Einstein and Heisenberg, and that’s all. However, one quickly realises that there is much more to this concise and richly illustrated text. Einstein and Heisenberg’s lives are embedded in the context of their time, with emphasis given to explaining the importance and nature of their interactions with the physicists of rank and name around them. The author cites many examples from letters and documents for both men within their respective environments, which are most interesting to read and illustrate well the spirit of the time. Direct interactions between the two heroes of the book were quite sparse, though.

At several stages throughout the book, the reader will become familiar with the personal life stories of both protagonists, who were, in spite of some commonalities, very different from each other. Common to both, for instance, was their devotion to music and their early interest and outstanding talent in physics as boys at schools in Munich, but, by contrast, they were very different in their relations with family and partners, as the author discusses in a lively way. Many of these aspects are well known, but there are also new facets presented. I liked the way this is done, and, in particular, the author does not shy away from also documenting the perhaps less commendable human aspects, but without judgement, leaving the reader to come to their own conclusion.

Topics covering a broad spectrum are commented on in a special chapter called “Social Affinities”. These include religion, music, the importance of family, and, in the case of Einstein, his relation to his wives and women in general, the way he dealt with his immense public reputation as a super scientist, and also his later years when he could be seen as “scientifically an outsider”. In Heisenberg’s case, one is reminded of his very major contributions to the restoration of scientific research in West Germany and Europe after World War II, not least of course in his crucial founding role in the establishment of CERN.

Do not expect a systematic, comprehensive introduction to relativity and quantum physics; this is not a textbook. Its great value is the captivating way the author illustrates how these great minds formed their respective theories in relation to the physics and academic world of their time. The reader learns not only about Einstein and Heisenberg, but also about many of their contemporary colleagues. A central part in this is the controversy about the interpretation of quantum mechanics among Heisenberg’s colleagues and mentors, such as Schrödinger, Bohr, Pauli, Born and Dirac, to name just a few.

Another aspect of overriding importance for the history of that time was of course the political environment spanning the time from before World War I to after World War II. Both life trajectories were influenced in a major way by these external political and societal factors. The author gives an impressive account of all these aspects, and sheds light on how the pair dealt with these terrible constraints, including their attitudes and roles in the development of nuclear weapons.

A special feature of the book, which will make it interesting to everybody, is the inclusion of various hints as to where relativity and quantum mechanics play a direct role in our daily lives today, as well as in topical contemporary research, such as the recently opened field of gravitational-wave astronomy.

This is an ambitious book, which tells the story of the birth of modern physics in a well-documented and well-illustrated way. The author has managed brilliantly to do this in a serious, but nevertheless entertaining, way, which will make the book a pleasant read for all.

The post Einstein and Heisenberg: The Controversy over Quantum Physics appeared first on CERN Courier.

]]>
Review Peter Jenni reviews Konrad Kleinknecht's new book on the interwoven stories of two giants of twentieth century physics. https://cerncourier.com/wp-content/uploads/2020/03/CCMarApr20_Reviews_kleinknecht_feature.jpg
Vladislav Šimák 1934–2019 https://cerncourier.com/a/life-with-antiprotons-and-quarks-vladislav-simak-1934-2019/ Wed, 11 Mar 2020 15:16:43 +0000 https://preview-courier.web.cern.ch/?p=86702 Since the early 1960s his vision and organisational skills helped shape experimental particle physics in Czechoslovakia and Czechia.

The post Vladislav Šimák 1934–2019 appeared first on CERN Courier.

]]>
Vladislav Šimák

Experimental particle physicist and founder of antiproton physics in Czechoslovakia (later the Czech Republic), Vladislav Šimák, passed away on 26 June 2019. Since the early 1960s his vision and organisational skills helped shape experimental particle physics, not only in Prague, but across the whole country.

After graduating from Charles University in Prague, he joined the group at the Institute of Physics of the Czechoslovak Academy of Sciences studying cosmic rays using emulsion techniques, earning a PhD in 1963. Though it was difficult to travel abroad at that time, Vlada got a scholarship and went to CERN, where he joined the group led by Bernard French investigating collisions of antiprotons using bubble chambers. It was there and then that his lifelong love affair with antiprotons began. He brought back to Prague film material showing the results of collisions of 5.7 GeV antiprotons and protons from a hydrogen bubble chamber, and formed a group of physicists and technicians, involving many diploma and PhD students who processed them. Vlada also fell in love with the idea of quarks as proposed by Gell-Mann and Zweig, and was the first Czech or Slovak physicist to apply a quark model to pion production in proton–antiproton collisions.

In the early 1970s, when contacts with the West were severely limited, Vlada exploited the experiences he accumulated at CERN and put together a group of Czech and Slovak physicists involved in the processing and analysis of data from proton–antiproton collisions, using the then-highest-energy beam of antiprotons (22.4 GeV) and a hydrogen bubble chamber at the Serpukhov accelerator in Russia. This experiment, which in the later stage provided collisions of antideuterons with protons and deuterons, gave many young physicists the chance to work on unique data for their PhDs and earned Vlada respect in the international community.

After the Velvet Revolution he played a pivotal role in accession to CERN membership

In the late 1980s, when the political atmosphere in Czechoslovakia eased, Vlada together with his PhD student joined the UA2 experiment at CERN’s proton–antiproton collider, where he devoted his attention to jet production. After the Velvet Revolution in November 1989 he played a pivotal role in the decision of the Czech and Slovak particle-physics community to focus on accession to CERN membership.

In 1992 Vlada took Czechoslovak particle physicists into the newly formed ATLAS collaboration, and in 1997 he joined the D0 experiment at Fermilab. He was active in ATLAS until very recently, and in 2014, in acknowledgment of his services to physics, the Czech Academy of Sciences awarded Vlada the Ernst Mach Medal for his contributions to the development of physics.

Throughout his life he combined his passion for physics with a love for music, for many years playing the violin in the Academy Chamber Orchestra. For many of us Vlada was a mentor, colleague and friend. We all admired his vitality and enthusiasm for physics, which was contagious. Vlada clearly enjoyed life and we very much enjoyed his company.

He will be sorely missed.

The post Vladislav Šimák 1934–2019 appeared first on CERN Courier.

]]>
News Since the early 1960s his vision and organisational skills helped shape experimental particle physics in Czechoslovakia and Czechia. https://cerncourier.com/wp-content/uploads/2020/03/CCMarApr20_Obits-Simak_feature.jpg
50 years of the GIM mechanism https://cerncourier.com/a/50-years-of-the-gim-mechanism/ Fri, 24 Jan 2020 14:34:54 +0000 https://preview-courier.web.cern.ch/?p=86381 A symposium to celebrate the fiftieth anniversary of Glashow, Iliopoulos and Maiani's explanation of the suppression of strangeness-changing neutral currents was held in Shanghai.

The post 50 years of the GIM mechanism appeared first on CERN Courier.

]]>
GIM originators 50 years on

In 1969 many weak amplitudes could be accurately calculated with a model of just three quarks, and Fermi’s constant and the Cabibbo angle to couple them. One exception was the remarkable suppression of strangeness-changing neutral currents. John Iliopoulos, Sheldon Lee Glashow and Luciano Maiani boldly solved the mystery using loop diagrams featuring the recently hypothesised charm quark, making its existence a solid prediction in the process. To celebrate the fiftieth anniversary of their insight, the trio were guests of honour at an international symposium at the T. D. Lee Institute at Shanghai Jiao Tong University on 29 October, 2019.

The UV cutoff needed in the three-quark theory became an estimate of the mass of the fourth quark

The Glashow–Iliopoulos–Maiani (GIM) mechanism was conceived in 1969, submitted to Physical Review D on 5 March 1970, and published on 1 October of that year, after several developments had defined a conceptual framework for electroweak unification. These included Yang–Mills theory, the universal V−A weak interaction, Schwinger’s suggestion of electroweak unification, Glashow’s definition of the electroweak group SU(2)_L × U(1)_Y, Cabibbo’s theory of semileptonic hadron decays and the formulation of the leptonic electroweak gauge theory by Weinberg and Salam, with spontaneous symmetry breaking induced by the vacuum expectation value of new scalar fields. The GIM mechanism then called for a fourth quark, charm, in addition to the three introduced by Gell-Mann, such that the first two blocks of the electroweak theory are each made of one lepton and one quark doublet, [(ν_e, e), (u, d)] and [(ν_µ, µ), (c, s)]. The quarks u and c are coupled by the weak interaction to two superpositions of the quarks d and s: u ↔ d_C, with d_C the Cabibbo combination d_C = cosθ_C d + sinθ_C s, and c ↔ s_C, with s_C the orthogonal combination. In subsequent years, a third generation, [(ν_τ, τ), (t, b)], was predicted to describe CP violation. No further generations have been observed yet.
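
In matrix form (a standard restatement of the Cabibbo mixing just described, not a formula quoted from the GIM paper), the weak partners of the up-type quarks are obtained by a single rotation of the down-type quarks,

\[
\begin{pmatrix} d_C \\ s_C \end{pmatrix}
=
\begin{pmatrix} \cos\theta_C & \sin\theta_C \\ -\sin\theta_C & \cos\theta_C \end{pmatrix}
\begin{pmatrix} d \\ s \end{pmatrix} ,
\]

so the charged-current doublets are (u, d_C) and (c, s_C), and the orthogonality of this rotation is what makes the cancellation described below possible.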

Problem solved

The GIM mechanism was the solution to a problem arising in the simplest weak interaction theory with one charged vector boson coupled to the Cabibbo currents. As pointed out in 1968, strangeness-changing neutral-current processes, such as K_L → µ⁺µ⁻ and K⁰–K̄⁰ mixing, are generated at one loop with amplitudes of order G sinθ_C cosθ_C (GΛ²), where G is the Fermi constant, Λ is an ultraviolet cutoff, and GΛ² (dimensionless) is the first term in a perturbative expansion which could be continued to take higher order diagrams into account. To comply with the strict limits existing at the time, one had to require a surprisingly small value of the cutoff, Λ, of 2−3 GeV, to be compared with the naturally expected value: Λ = G^(−1/2) ~ 300 GeV. This problem was taken seriously by the GIM authors, who wrote that “it appears necessary to depart from the original phenomenological model of weak interactions”.

GIM mechanism Feynman diagrams

To sidestep this problem, Glashow, Iliopoulos and Maiani brought in the fourth “charm” quark, already introduced by Bjorken, Glashow and others, with its typical coupling to the quark combination left alone in the Cabibbo theory: c ↔ s_C = −sinθ_C d + cosθ_C s. Amplitudes for s → d with u or c on the same fermion line would cancel exactly for m_c = m_u, suggesting a more natural means to suppress strangeness-changing neutral-current processes to measured levels. For m_c >> m_u, a residual neutral-current effect would remain, which, by inspection, and for dimensional reasons, is of order G sinθ_C cosθ_C (Gm_c²). This was a real surprise: the “small” UV cutoff needed in the simple three-quark theory became an estimate of the mass of the fourth quark, which was indeed sufficiently large to have escaped detection in the unsuccessful searches for charmed mesons that had been conducted in the 1960s. With the two quark doublets included, a detailed study of strangeness-changing neutral-current processes gave m_c ∼ 1.5 GeV, a value consistent with more recent data on the masses of charmed mesons and baryons. Another aspect of the GIM cancellation is that the weak charged currents form an SU(2) algebra together with a neutral component that has no strangeness-changing terms. Thus, there is no difficulty in including the two quark doublets in the unified electroweak group SU(2)_L × U(1)_Y of Glashow, Weinberg and Salam. The 1970 GIM paper noted that “in contradistinction to the conventional (three-quark) model, the couplings of the neutral intermediary – now hypercharge conserving – cause no embarrassment.”
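
Schematically (a textbook-style restatement of the cancellation just described, with a generic loop function rather than the explicit expressions of the original paper), the s → d neutral-current amplitude sums the u- and c-quark loop contributions with opposite Cabibbo factors,

\[
\mathcal{A}(s \to d) \;\propto\; G \sum_{i=u,c} \lambda_i \, F\!\left(\frac{m_i^2}{M_W^2}\right),
\qquad
\lambda_u = \cos\theta_C \sin\theta_C = -\lambda_c ,
\]

so that the amplitude vanishes for m_u = m_c and, for m_c >> m_u, is of order G sinθ_C cosθ_C (G m_c²), turning the kaon data into an estimate of the charm-quark mass.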

The GIM mechanism has become a cornerstone of the Standard Model and it gives a precise description of the observed flavour changing neutral current processes for s and b quarks. For this reason, flavour-changing neutral currents are still an important benchmark and give strong constraints on theories that go beyond the Standard Model in the TeV region.

The post 50 years of the GIM mechanism appeared first on CERN Courier.

]]>
Meeting report A symposium to celebrate the fiftieth anniversary of Glashow, Iliopoulos and Maiani's explanation of the suppression of strangeness-changing neutral currents was held in Shanghai. https://cerncourier.com/wp-content/uploads/2020/01/MarchApril_meeting-report_feature.jpg
CERN’s proton synchrotron turns 60 https://cerncourier.com/a/cerns-proton-synchrotron-turns-60/ Fri, 10 Jan 2020 13:17:31 +0000 https://preview-courier.web.cern.ch/?p=86050 A colloquium was held at CERN on 25 November 2019.

The post CERN’s proton synchrotron turns 60 appeared first on CERN Courier.

]]>
Construction of the tunnel for the Proton Synchrotron

On 24 November 1959, CERN’s Proton Synchrotron (PS) first accelerated beams to an energy of 24 GeV. 60 years later, it is still at the heart of CERN’s accelerator complex, delivering beams to the fixed-target physics programme and the LHC with intensities exceeding the initial specifications by orders of magnitude. To celebrate the anniversary a colloquium was held at CERN on 25 November 2019, with PS alumni presenting important phases in the life of the accelerator.

The PS and its sister machine, Brookhaven’s Alternating Gradient Synchrotron, are the world’s oldest operating accelerators

The PS is CERN’s oldest operating accelerator, and, together with its sister machine, the Alternating Gradient Synchrotron (AGS) at Brookhaven National Laboratory in the US, one of the two oldest still operating accelerators in the world. Both designs are based on the innovative concept of the alternating gradient, or strong-focusing, principle developed by Ernest Courant, Milton Stanley Livingston, Hartland Snyder and John Blewett. This technique allowed a significant reduction in the size of the vacuum chambers and magnets, and unprecedented beam energies. In 1952 the CERN Council endorsed a study for a synchrotron based on the alternating-gradient principle, and construction of a machine with a design-energy range from 20 to 30 GeV was approved in October 1953. Its design, manufacture and construction took place from 1954 to 1959. Protons made first turns on 16 September 1959, and on 24 November beam was accelerated beyond transition and to an energy of 24 GeV. On 8 December the design energy of 28.3 GeV was reached and the design intensity exceeded, at 3 × 10¹⁰ protons per pulse.
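
The net focusing produced by alternating gradients can be seen already in the thin-lens picture (a standard textbook illustration, not a quotation from the PS design): a focusing and a defocusing quadrupole of equal strength separated by a drift of length d act together as a converging lens in both transverse planes,

\[
\frac{1}{f_{\text{pair}}} = \frac{1}{f_1} + \frac{1}{f_2} - \frac{d}{f_1 f_2}
\;\;\xrightarrow{\;f_1 = -f_2 = f\;}\;\;
\frac{1}{f_{\text{pair}}} = \frac{d}{f^2} > 0 ,
\]

independently of the sign of f, which is why the same alternating sequence confines the beam both horizontally and vertically and allows much smaller magnet apertures.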

The first combined-function magnet

The PS has proven to be a flexible design, with huge built-in potential. Though the first experiments were performed with internal targets, extractions to external targets were soon added to the design, and further innovative extraction schemes were added through the years. On the accelerator side, the intensity was progressively ramped up, with the commissioning of the PS Booster in 1972, the repeated increase of the injection energy, and many improvements in the PS itself. Through the years more and more users requested beam from the PS, for example for the East Area, antiproton physics and a neutron time-of-flight facility.

With the commissioning of the ISR, SPS, LEP and LHC machines, the PS took on a new role as an injector of protons, antiprotons, leptons and ions, while continuing its own physics programme. A new challenge was the delivery of beams for the LHC: these beams need to be transversely very dense (“bright”), and have a longitudinal structure that is generated using the different radio-frequency systems of the PS, with the PS thereby contributing its fair share to the success of the LHC. And there are more challenges ahead. The LHC’s high-luminosity upgrade programme demands beam parameters out of reach for today’s injector complex, motivating the ambitious LHC Injectors Upgrade Project. Installations are now in full swing, and Run 3 will take CERN’s PS into a new parameter regime and into another interesting chapter in its life.

The post CERN’s proton synchrotron turns 60 appeared first on CERN Courier.

]]>
Meeting report A colloquium was held at CERN on 25 November 2019. https://cerncourier.com/wp-content/uploads/2020/01/CCJanFeb20_FN-PS2.jpg
Maxwell’s enduring legacy https://cerncourier.com/a/maxwells-enduring-legacy/ Tue, 26 Nov 2019 22:01:07 +0000 https://preview-courier.web.cern.ch/?p=85592 Malcolm Longair has written a monumental account of the Cavendish Laboratory.

The post Maxwell’s enduring legacy appeared first on CERN Courier.

]]>
Longair Maxwell

In 1871, James Clerk Maxwell undertook the titanic enterprise of planning a new physics laboratory for the University of Cambridge from scratch. To avoid mistakes, he visited the Clarendon Laboratory in Oxford, and the laboratory of William Thomson (Lord Kelvin) in Glasgow – then the best research institutes in the country – to learn all that he could from their experiences. Almost 150 years later, Malcolm Longair, a renowned astrophysicist and the Cavendish Laboratory’s head from 1997 to 2005, has written a monumental account of the scientific achievements of those who researched, worked and taught at a laboratory which has become an indispensable part of the machinery of modern science.

The 22 chapters of the book are organised in ten parts corresponding to the inspiring figures who led the laboratory through the years, most famous among them Maxwell himself, Thomson, Rutherford, Bragg, Mott and a few others. The numerous Nobel laureates who spent part of their careers at the Cavendish are also nicely characterised, among them Chadwick, Appleton, Kapitsa, Cockcroft and Walton, Blackett, Watson and Crick, Cormack, and, last but not least, Didier Queloz, Nobel laureate in 2019 and professor at the universities of Cambridge and Geneva. You may even read about friends and collaborators, as the exposition includes the most recent achievements of the laboratory.

Rutherford and Thomson managed the finances of the laboratory almost from their personal cheque book

Besides the accuracy of the scientific descriptions and the sharpness of the ideas, this book inaugurates a useful compromise that might inspire future science historians. Until now it has been customary to write biographies (or collected works) of leading scientists and extensive histories of various laboratories: here these two complementary aspects are happily married in a way that may lead to further insights on the genesis of crucial discoveries. Longair elucidates the physics with a competent care that is often difficult to find. His exciting accounts will stimulate an avalanche of thoughts on the development of modern science. By returning to a time when Rutherford and Thomson managed the finances of the laboratory almost from their personal cheque book, this book will stimulate readers to reflect on the interplay between science, management and technology.

History is often instrumental in understanding where we come from, but it cannot reliably predict directions for the future. Nevertheless the history of the Cavendish shows that lasting progress can come from diversity of opinion, the inclusiveness of practices and mutual respect between fundamental sciences. How can we sum up the secret of the scientific successes described in this book? A tentative recipe might be unity in necessary things, freedom in doubtful ones and respect for every honest scientific endeavour.

The post Maxwell’s enduring legacy appeared first on CERN Courier.

]]>
Review Malcolm Longair has written a monumental account of the Cavendish Laboratory. https://cerncourier.com/wp-content/uploads/2019/11/Longair-Maxwell-preview.jpg
ATLAS – A 25 Year Insider Story of the LHC Experiment https://cerncourier.com/a/atlas-a-25-year-insider-story-of-the-lhc-experiment/ Tue, 26 Nov 2019 22:00:50 +0000 https://preview-courier.web.cern.ch/?p=85601 117 authors collaborated to write on diverse aspects of the ATLAS project.

The post ATLAS – A 25 Year Insider Story of the LHC Experiment appeared first on CERN Courier.

]]>
ATLAS A 25 Year Insider Story

ATLAS – A 25 Year Insider Story of the LHC Experiment is a comprehensive overview of one of the most complex and successful scientific endeavours ever undertaken. 117 authors collaborated to write on diverse aspects of the ATLAS project, ranging from the early days of the proto-collaboration, to the test-beam studies to verify detector concepts, the design, building and installation of the detector systems, building the event selection and computing environment required, forming the organisation, and finally summarising the harvest of physics gathered thus far. Some of the chapters cover topics that are discussed elsewhere – the description of the detector summarises more extensive journal publications, the major physics achievements have been covered in recent review articles and the organisational structure is discussed on the web – but this volume usefully brings these various aspects together in a single place with a unified treatment.

Despite the many authors who contributed to this book, the style and level of treatment is reasonably coherent. There are many figures and pictures that augment the text. Those showing detector elements that are now buried out of sight are important complements to the text descriptions: the pictures of circuit boards are less helpful, besides demonstrating that these electronics exist. A most engaging feature is the inclusion of one-page “stories” at the ends of the chapters, each giving some insight into the ups and downs of how the enterprise works. Among these vignettes we have such stories as the ATLAS collaboration week that almost no one attended and the spirit of camaraderie among the experimenters and accelerator operators at the daily 08:30 run meetings.

One could imagine several audiences for this book, and I suspect that, apart from ATLAS collaborators themselves, each reader will find different chapters most suited to their interests. The 26-page chapter “The ATLAS Detector Today” offers a more accessible overview for students just joining the collaboration than the 300-page journal publication referenced in most ATLAS publications. Similarly, “Towards the High-Luminosity LHC” gives a helpful brief introduction to the planned upgrades. “Building up the Collaboration” will be useful to historians of science seeking to understand how scientists, institutions and funding agencies engage in a project whose time is ripe. Those interested in project management will find “Detector Construction Around the World” illuminating: this chapter shows how the design and fabrication of detector subsystems is organised with several, often geographically disparate, institutions joining together, each contributing according to its unique talents. “From the LoI to the Detector Construction” and “Installation of the Detectors and Technical Coordination” will appeal to engineers and technical managers. The chapters “Towards the ATLAS Letter of Intent” and “From Test Beams to First Physics” catalogue the steps that were necessary to realise the collaboration and experiment, but whose details are primarily interesting to those who lived through those epochs. Finally, “Highlights of Physics Results (2010 – 2018)” could have offered an exciting story for non-scientists, and indeed the thrill of the chase for the Higgs boson comes through vividly, but with unexplained mentions of leptons, loops and quantum corrections, the treatment is at a more technical level than would be ideal for such readers, and the plots plucked from publications are not best suited to convey what was learned to non-physicists.

What makes a collaboration like this tick?

Given the words in the foreword that the book is “intended to provide an insider story covering all aspects of this global science project,” I looked forward to the final chapter, “ATLAS Collaboration: Life and its Place in Society”, to get a sense of the human dimensions of the collaboration. While some of that discussion is quite interesting – the collaboration’s demographics and the various outreach activities undertaken to engage the public – there is a missing element that I would have appreciated: what makes a collaboration like this tick? How did the large personalities involved manage to come to common decisions and compromises on the detector designs? How do physicists from nations and cultures that are at odds with each other on the world stage manage to work together constructively? How does one account for the distinct personalities that each large scientific collaboration acquires? Why does every eligible author sign all ATLAS papers, rather than just those who did the reported analysis? How do the politics of choosing the collaboration management work? Were there design choices that came to be regretted in the light of subsequent experience? In addition to the numerous successes, were there failures? Although I recognise that discussing these more intimate details runs counter to the spirit of such large collaborations, in which one seeks to damp out as much internal conflict as possible, examining some of them would have made for a more compelling book for the non-specialist.

The authors should be commended for writing a book unlike any other I know of. It brings together a factual account of all aspects of ATLAS’s first 25 years. Perhaps as time passes and the participants mellow, the companion story of the how, in addition to the what and where, will also be written.

The post ATLAS – A 25 Year Insider Story of the LHC Experiment appeared first on CERN Courier.

]]>
Review 117 authors collaborated to write on diverse aspects of the ATLAS project. https://cerncourier.com/wp-content/uploads/2019/06/Atlas-1.jpg
The Weil conjectures https://cerncourier.com/a/the-weil-conjectures/ Thu, 21 Nov 2019 13:53:08 +0000 https://preview-courier.web.cern.ch/?p=85231 Karen Olsson's book tours the lives of Simone Weil and her brother, the noted mathematician André Weil.

The post The Weil conjectures appeared first on CERN Courier.

]]>
The Weil Conjectures

“I am less interested in mathematics than in mathematicians,” wrote Simone Weil to her brother André, a world-class mathematician who was imprisoned in Rouen at the time. The same might be said about US novelist and onetime mathematics student Karen Olsson. Despite the title, her new book, The Weil Conjectures, stars the extraordinary siblings at the expense of André’s mathematical creation.

First conceived by André in prison, and finally proven three decades later by Pierre Deligne in 1974, the Weil conjectures are foundational pillars of algebraic geometry. Linking the continuous and the discrete, and the realms of topology and number theory, they are pertinent to efforts to unite gravity with the quantum theories of the other forces. Frustratingly, though, mathematical hobbyists hoping for insights into the conjectures will be disappointed by this book, which instead zeroes in on the people in orbit around the maths.

Olsson is particularly fascinated by Simone Weil. An iconoclastic public intellectual in France, and possessed by an intensely authentic humanity that the author presents as quite alien to André, Simone was nevertheless envious of her brother’s mathematical insight, writing that she “preferred to die than to live without that truth”. Olsson is clearly empathetic, and so, one would suspect, will be most readers in a profession where intellect is all. Whether one is a grad student or a foremost expert in the field, there is always someone smarter, whose insights seem inaccessible.

Physicists may also detect echoes of the current existential crisis in theoretical physics (see Redeeming the role of mathematics) in Simone’s thinking. While she feels that “unless one has exercised one’s mind at the gymnastics of mathematics, one is incapable of precise thought, which amounts to saying that one is good for nothing,” she criticises “the absolute dominion that is exercised over science by the most abstract forms of mathematics.”

Peppered with anecdotes about other mathematicians – Girolamo Cardano is described as a “total dick” – and more a succession of scenes than a biography, the book is as much about Olsson herself as the Weils. The prose zig-zags between vignettes from the author’s own life and the Weils without warning, leaving the reader to search for connections. Facts are unsourced, and readers are left to guess what is historical and what is the author’s impressionistic character portrait. Charming and quirky, the text transforms dusty perceptions of the meetings of the secret Bourbaki society of French mathematicians into scenes of lakeside debauchery and translucent camisoles that are almost reminiscent of Pride and Prejudice. Olsson even takes us into Simone’s dreams, with the conjectures only cropping up at the end of the book. If you limit your reading to the maths and the Weils, the resulting slim volume is a page turner.

The post The Weil conjectures appeared first on CERN Courier.

]]>
Review Karen Olsson's book tours the lives of Simone Weil and her brother, the noted mathematician André Weil. https://cerncourier.com/wp-content/uploads/2019/11/CCNovDec19_REV_Olsson_feature.jpg
Austria and CERN celebrate 60 years https://cerncourier.com/a/austria-and-cern-celebrate-60-years/ Tue, 12 Nov 2019 13:08:12 +0000 https://preview-courier.web.cern.ch/?p=85305 Associated in bygone days with the UA1 and DELPHI, today hundreds of Austrian scientists contribute to CERN’s experimental programme.

The post Austria and CERN celebrate 60 years appeared first on CERN Courier.

]]>
Manfred Krammer, Jochen Schieck and Fabiola Gianotti

Since joining in 1959, Austria has never stopped contributing to CERN. Associated in bygone days with the UA1 experiment at the SPS, where the W and Z bosons were discovered, and later with LEP’s DELPHI experiment, which helped to put the Standard Model on a solid footing, today hundreds of Austrian scientists contribute to CERN’s experimental programme, and its institutes participate in ALICE, ATLAS, CMS and in experiments at the Antiproton Decelerator. Two of the laboratory’s directors, Willibald Jentschke and Victor Frederick Weisskopf, were born in Austria.

To celebrate the 60th anniversary of Austria’s membership, the public were invited to “Meet the Universe” during a series of exhibitions and public events from 5–12 September, organised by the Institute of High Energy Physics (HEPHY) of the Austrian Academy of Sciences. CERN Director-General Fabiola Gianotti opened proceedings by discussing the role of particle colliders as tools for exploration. The following day, 2017 Nobel Prize winner Barry Barish presented his vision for gravitational-wave detectors and the dawn of multi-messenger astronomy. The programme continued with public lectures by Jon Butterworth of University College London, presenting the various experimental paths that could reveal hints for new physics, and Christoph Schwanda of HEPHY discussing the matter–antimatter asymmetry in the universe.

“We’d like to celebrate this important anniversary and continue to contribute to this long-term endeavour together with the other countries that participate in CERN’s research programme,” said Manfred Krammer, of HEPHY and head of CERN’s experimental physics department.

The long-standing relationship with CERN has offered broad benefits to the Austrian scientific community, a notable example being the Vienna Conference on Instrumentation, and since 1993 the Austrian doctoral programme, which has now trained more than 200 participants, has been fully integrated with CERN’s PhD programme. Today, Austria’s collaboration with CERN extends far beyond particle physics. Business incubation centres were launched in Austria in 2015, and the MedAustron advanced hadron-therapy centre (CERN Courier September/October 2019 p10), which was developed in collaboration with CERN, is among the world’s leading medical research facilities.

“CERN is the place to push the frontiers, and scientists from Austria will contribute to make the next steps towards the unknown,” said HEPHY director Jochen Schieck.

The post Austria and CERN celebrate 60 years appeared first on CERN Courier.

]]>
Meeting report Associated in bygone days with the UA1 and DELPHI, today hundreds of Austrian scientists contribute to CERN’s experimental programme. https://cerncourier.com/wp-content/uploads/2019/11/CCNovDec19_FN_austria_feature.jpg
The rise of French particle physics https://cerncourier.com/a/the-rise-of-french-particle-physics/ Mon, 21 Oct 2019 12:43:36 +0000 https://preview-courier.web.cern.ch/?p=84851 Founded 80 years ago, the French National Centre for Scientific Research (CNRS) is one of Europe’s largest research institutions.

The interior of a radio-frequency cavity

Marie Curie once described the laboratory she shared with her husband Pierre as “just a clapboard hut with an asphalt floor and glass roof giving incomplete protection against the rain, without any amenities”. Even her colleagues abroad were shocked by their paltry resources. German chemist Wilhelm Ostwald noted: “The laboratory was a cross between a stable and a potato shed, and if it hadn’t been for the chemical apparatus, I would have thought it a practical joke.” In the 1920s, newspapers showed the desperate situation the French laboratories were in. “There are some in attics, others in cellars, others in the open air…” the Petit Journal newspaper reported in 1921. Increasing research funding to elevate France to the level of countries like Germany became a rallying point for the nation.

In the inter-war years, Jean Perrin, winner of the 1926 Nobel Prize in Physics for his work showing the existence of atoms, championed the development of science and had the support of many other scientists. Thanks to financial support from the Rothschild Foundation, he founded the Institute of Physico-Chemical Biology, the first place to employ researchers full-time. In 1935 he managed to get the National Scientific Research Fund set up to fund academic projects and research fellowships. One of its first fellows, in 1937, was the young Lew Kowarski, who had joined Frédéric Joliot-Curie’s laboratory at the Collège de France. In May 1939, Joliot-Curie and Kowarski, together with Hans von Halban, filed patents via the fund that outlined the production of nuclear power and the principle of the atomic bomb.

A new government under Léon Blum took office in 1936, and with it came the appointment of France’s first under-secretary of state for scientific research. Another first was the inclusion of three women in the government at a time when women still did not have the vote in France. Irène Joliot-Curie took up her post for three months in support of women’s rights and scientific research. During this short period, she set out major objectives: an increase in research budgets, salaries and grants for research fellows.

After her resignation, Perrin took over. This sexagenarian with the appearance of a dishevelled scientist “immediately showed the ardour of a young man and the enthusiasm of a beginner, not for the prestige, but for the means of action the post provided”, Jean Zay, the very young minister of national education at the time, noted in his memoirs. Over the next four years, his achievements included the opening of laboratories such as the Paris Institute of Astrophysics and culminated in the decree founding CNRS, published in October 1939. Six weeks after the outbreak of the Second World War, Perrin announced: “Science is not possible without freedom of thought, and freedom of thought cannot exist without freedom of conscience. You cannot require chemistry to be Marxist and expect to produce great chemists; you cannot require physics to be 100% Aryan and expect to keep the greatest physicists in your country… Each of us can die, but we want our ideals to live on.”

The founding principle of the CNRS “to identify and conduct, alone or with its partners, every type of research in the interest of science and the technological, social and cultural advancement of the country” stands strong today. Around 32,000 people, including 11,000 academics and researchers, currently work at CNRS in collaboration with universities, private laboratories and other organisations. Most of the 1100 CNRS laboratories are co-directed with a partner institution and host CNRS personnel and, in most cases, university faculty as well. These “mixed research units”, which were introduced in 1966, form the backbone of French research and allow cutting-edge research to be done whilst remaining rooted in teaching and contact with students.

The evolution of nuclear and high-energy physics

CNRS founder Jean Perrin

Under the auspices of the national ministry of higher education, research and innovation, CNRS is France’s largest research institution. With an annual budget of €3.4 billion, it covers the whole gamut of scientific disciplines, from the humanities to natural and life sciences, the science of matter and the universe, and from fundamental to applied research. The disciplines are organised thematically into 10 institutes, which manage the scientific programmes and a significant share of the investment in research infrastructure. CNRS also plays a coordinating role, particularly through its three national institutes: the National Institute of Nuclear and Particle Physics (IN2P3), the National Institute of Sciences of the Universe and the National Institute for Mathematical Sciences and their Interactions.

When CNRS was founded, French physicists were among the world’s leading: Irène and Frédéric Joliot-Curie, Jean Perrin, Louis de Broglie and Pierre Auger are among the names from this time that have entered the history books. Frédéric Joliot-Curie’s laboratory at the Collège de France played a crucial role thanks to its cyclotron, as did Irène Joliot-Curie’s Radium Institute, Louis Leprince-Ringuet’s laboratory at the École polytechnique and Jean Thibaud’s in Lyon. The newly established CNRS put up the funds for facilities, research fellows, technical personnel and chairs of nuclear physics at universities and the elite Grandes Écoles. The war broke out the same year CNRS was founded, bringing everything to a halt: researchers either went into exile or tried to continue running their laboratories in inevitable isolation.

Frédéric Joliot-Curie, buoyed by his involvement in the Resistance, took up the reins of CNRS in August 1944 and strove to help France catch up again after the war, particularly in nuclear physics. After atomic bombs were dropped on Hiroshima and Nagasaki, General de Gaulle asked Frédéric Joliot-Curie and Raoul Dautry, who was minister for reconstruction and urbanism, to set up the Commissariat à l’Energie Atomique (CEA). Frédéric Joliot-Curie saw this organisation as a means of bringing together and coordinating all fundamental research in nuclear physics, including research undertaken in university laboratories. From 1946, all the big names – Pierre Auger, Irène Joliot-Curie, Francis Perrin and Lew Kowarski – joined CEA. CNRS was therefore not greatly involved in this area. In 1947 the decision was taken to build a facility in Saclay combining fundamental and applied research. André Berthelot was the director of the nuclear physics division at Saclay and installed several accelerators there.

Founding CERN

In the 1950s French physicists played a key role in the establishment of CERN: Louis de Broglie, the first well-known scientist to call for the creation of a multinational laboratory; Auger, who was director of the department of exact and natural sciences at UNESCO; Dautry, director-general of CEA; Francis Perrin, the CEA high commissioner; and Kowarski, one of CERN’s first staff members who later became director of scientific and technical services. He is credited with the construction of the first bubble chamber at CERN and the introduction of computers. Joliot-Curie, relieved of his duties at CEA owing to his political beliefs, was very upset not to be appointed to the CERN Council – unlike Francis Perrin, who succeeded him at CEA. Alive to CERN’s potential, Louis Leprince-Ringuet shifted the focus of his teams’ research from cosmic rays to accelerators. He became the first French chair of the scientific policy committee (SPC) in 1964 and his laboratory contributed greatly to the involvement of French physicists at CERN.

Irène and Frédéric Joliot-Curie in 1935

Another CNRS recruit of the post-war period who also made a name for himself at CERN was Georges Charpak. Securing a research position at CNRS in 1948, he wrote his thesis under the supervision of Frédéric Joliot-Curie, who had wanted to nudge Charpak towards nuclear physics. But he picked his own area: detectors. He was hired at CERN by Leon Lederman in 1963 and went on to develop the multiwire proportional chamber, which replaced bubble chambers and spark chambers by enabling digital processing of the data. The invention won him the 1992 Nobel Prize in Physics.

When he returned to the Collège de France, Frédéric Joliot-Curie joined forces with Irène to create the Orsay campus. Given the prospect of new facilities at CERN, they felt that France needed to develop its own infrastructure to enable French physicists to train and prepare their experiments for CERN. “Helping to create and sustain CERN whilst letting fundamental nuclear-physics research fizzle out in France would be to act against the interests of our country and those of science,” Irène Joliot-Curie wrote in Le Monde. The government under Pierre Mendès France made research a priority and in 1954 granted funds for the construction of two accelerators, a synchrocyclotron at Irène Joliot-Curie’s Radium Institute and a linear accelerator for Yves Rocard’s Physics Laboratory at the École normale supérieure. Irène Joliot-Curie secured the plots required in Orsay for the construction of the Orsay Institute of Nuclear Physics (IPNO) and the Linear Accelerator Laboratory (LAL). She did not live to see the new laboratories: Frédéric Joliot-Curie became IPNO’s first director, while Hans von Halban was called back from England to manage LAL. These two emblematic institutes still play a major role in French contributions to CERN.

From strength to strength

During the 1950s and 1960s CNRS went from strength to strength and set up more laboratories for high-energy and nuclear physics. A Cockcroft–Walton accelerator, built in Strasbourg by the Germans during the war, was the seed that grew into the Nuclear Research Centre directed by Serge Gorodetsky. Maurice Scherer’s chair in nuclear physics in Caen, created in 1947, evolved into the Corpuscular Physics Laboratory. One of its first doctoral students, Louis Avan, founded the eponymous laboratory in Clermont-Ferrand in 1959. Louis Néel laid the foundations for important physics research in Grenoble, and CEA set up a centre for nuclear studies there in 1956. The Franco-German research reactor was built at the Institut Laue-Langevin in Grenoble in 1967. In the same year, the Grenoble Nuclear Science Institute was established, hosting a cyclotron used in particular for the production of radioisotopes for medicine. Its director, Jean Yoccoz, later became a director of IN2P3. The Centre of Nuclear Studies of Bordeaux-Gradignan was established in a disused Bordeaux château in 1969.

The physicists at these laboratories played an active role in CERN experiments, thanks in particular to the flexible secondment policy at CNRS. Among them was Bernard Gregory, who worked at Leprince-Ringuet’s laboratory and focused on the construction of a large, 81 cm liquid-hydrogen bubble chamber in Saclay in preparation for the impending commissioning of the Proton Synchrotron (PS) at CERN. It produced more than 10 million pictures of particle interactions, which were shared all over Europe. In 1965 Gregory became the Director-General of CERN. Five years later, he replaced Louis Leprince-Ringuet as the head of the École polytechnique laboratory, and then became director general of CNRS. He was elected President of the CERN Council in 1977.

Managing expansion

In the 1960s research facilities were becoming so large that the idea came about within CNRS to create national institutes to coordinate the laboratories’ resources and programmes. LAL director André Blanc-Lapierre campaigned for a National Institute of Nuclear and Particle Physics, following the example of the Italian INFN founded in 1951. The aim was to organise the funding allocated to the various laboratories by CNRS, the universities and CEA. Discussions between the partners then began.

The Italian AdA collider

In parallel, French physicists were engaged in another debate. After the construction of the 3 GeV proton accelerator SATURNE at CEA in Saclay in 1958 and the 1.3 GeV electron linear accelerator at LAL in Orsay in 1962, as well as the later electron–positron collider ACO at LAL in Orsay, opinions were divided about building a national machine that would complement CERN’s experimental capabilities and strengthen the French scientific community. Two options were in the running: a proton machine and an electron machine. This decision was especially important since other machines were springing up elsewhere in Europe. In Italy, the electron–positron collider AdA was followed by ADONE in 1969. In Hamburg, Germany, the DESY electron synchrotron was commissioned in 1964.

France’s priority, however, was construction at the European level with CERN, so neither of the two proposed projects ever got off the ground. Jean Teillac, who succeeded Frédéric Joliot-Curie as head of the IPNO, founded IN2P3 in 1971, federating the CNRS and university laboratories. It was only later, in 1975, that CEA and IN2P3 decided to collaborate in building a national machine in Caen, the Large National Heavy-Ion Accelerator (“Grand Accélérateur National d’Ions Lourds”, GANIL), which specialised in nuclear physics. Despite the fact that the CEA laboratories involved were not part of IN2P3, the physicists of the two organisations collaborated extensively.

In this context, André Lagarrigue, who had been the director of LAL since 1969, proposed the construction of a new bubble chamber, Gargamelle, on a neutrino beam at CERN. At the École polytechnique, he had previously investigated the feasibility of bubble chambers filled with heavy liquids rather than hydrogen, which favour interactions with neutrinos. After its construction at Saclay, the chamber, filled with liquid freon, was installed at CERN and detected neutral currents in 1973. This was a major discovery, which would certainly have won Lagarrigue a Nobel prize had he not died of a heart attack in 1975.

Then to now

André Lagarrigue in front of the Gargamelle bubble chamber at CERN

Today, IN2P3 has around 20 laboratories and some 3200 staff, including 1000 academics and researchers in nuclear physics, particle physics, astroparticle physics and cosmology. The institute contributes to the development of accelerators, detectors and observation instruments and their applications to societal needs. Its data centre in Lyon plays an important role in processing and storing large volumes of data, as well as housing the digital infrastructure for other disciplines.

IN2P3 has had strong links with CERN through many projects and experiments. These include: the discovery of the W and Z bosons by UA1 and UA2; contributions to ALEPH, DELPHI and L3 at LEP; the discovery of the Higgs boson by ATLAS and CMS at the LHC; flavour studies at LHCb and heavy-ion physics at ALICE; neutrino physics; CP violation; antimatter experiments, as well as nuclear physics. These joint ventures also involve other CNRS institutes like the INP (Institute of Physics), with its specialists in quantum physics and lasers, as well as in strong magnetic fields.

Future CERN projects are currently being discussed in the update of the European strategy for particle physics. They offer the prospect of new collaborations between CERN and CNRS in high-energy physics, but also in engineering, computing, biomedical applications and even the humanities and social sciences. No doubt the synergy between these two organisations, with their exceptionally rich scientific knowledge, will continue to give birth to exciting new research.

  • A French version of this article is available here.

LEP’s electroweak leap https://cerncourier.com/a/leps-electroweak-leap/ Wed, 11 Sep 2019 10:41:29 +0000 https://preview-courier.web.cern.ch/?p=84246 In the autumn of 1989 the Large Electron Positron collider (LEP) delivered the first of several results that still dominate the landscape of particle physics today.

Trailblazing events

In the early 1970s the term “Standard Model” did not yet exist – physicists used “Weinberg–Salam model” instead. But the discovery of the weak neutral current in Gargamelle at CERN in 1973, followed by the prediction and observation of particles composed of charm quarks at Brookhaven and SLAC, quickly shifted the focus of particle physicists from the strong to the electroweak interactions – a sector in which trailblazing theoretical work had quietly taken place in the previous years. Plans for an electron–positron collider at CERN were soon born, with the machine first named LEP (Large Electron Positron collider) in a 1976 CERN yellow report authored by a distinguished study group featuring, among others, John Ellis, Burt Richter, Carlo Rubbia and Jack Steinberger.

LEP’s size – four times larger than anything before it – was dictated by the need to observe W-pair production, and to check that its cross section did not diverge as a function of energy. The phenomenology of the Z-boson’s decay was to come under similar scrutiny. At the time, the number of fermion families was unknown, and it was even possible that there were so many neutrino families that the Z lineshape would be washed out. LEP’s other physics targets included the possibility of producing Higgs bosons. At the time, the mass of the Higgs boson was completely unknown and could have been anywhere from around zero to 1 TeV.

The CERN Council approved LEP in October 1981 for centre-of-mass energies up to 110 GeV. It was a remarkable vote of confidence in the Standard Model (SM), given that the W and Z bosons had not yet been directly observed. A frantic period followed, with the ALEPH, DELPHI, L3 and OPAL detectors approved in 1983. Based on similar geometric principles, they included drift chambers or TPCs for the main trackers, BGO crystals, lead–glass or lead–gas sandwich electromagnetic calorimeters, and, in most cases, an instrumented return yoke for hadron calorimetry and muon filtering. The underground caverns were finished in 1988 and the detectors were in various stages of installation by the end of spring 1989, by which time the storage ring had been installed in the 27 km-circumference tunnel (see The greatest lepton collider).

Expedition to the Z pole

The first destination was the Z pole at an energy of around 90 GeV. Its location was then known to ±300 MeV from measurements of proton–antiproton collisions at Fermilab’s Tevatron. The priority was to establish the number of light neutrino families, a number that not only closely relates to the number of elementary fermions but also impacts the chemical composition and large-scale structure of the universe. By 1989 the existence of the νe, νμ and ντ neutrinos was well established. Several model-dependent measurements from astrophysics and collider physics at the time had pointed to the number of light active neutrinos (Nν) being less than five, but the SM could, in principle, accommodate any higher number.

The OPAL logbook entry for the first Z boson seen at LEP

The initial plan to measure Nν using the total width of the Z resonance was quickly discarded in favour of the visible peak cross section, where the effect was far more prominent – and, in first approximation, insensitive to possible new detectable channels. The LEP experiments were therefore thrown in at the deep end, needing to make an absolute cross-section measurement with completely new detectors in an unfamiliar environment that demanded that triggers, tracking, calorimetry and the luminosity monitors all work and acquire data in synchronisation.

On the evening of 13 August, during a first low-luminosity pilot run just one month after LEP achieved first turns, OPAL reported the first observation of a Z decay (see OPAL fruits). Each experiment quickly observed a handful more. The first Z-production run took place from 18 September to 9 October, with the four experiments accumulating about 3000 visible Z decays each. They took data at the Z peak and at 1 and 2 GeV either side, improving the precision on the Z mass and allowing a measurement of the peak cross section. The results, including those from the Mark II collaboration at SLAC’s SLC linear electron–positron collider, were published and presented in CERN’s overflowing main auditorium on 13 October.

After only three weeks of data taking and 10,000 Z decays, the number of neutrinos was found to be three. In the following years, some 17 million Z decays were accumulated, and cross-section measurement uncertainties fell to the per-mille level. And while the final LEP number – Nν = 2.9840 ± 0.0082 – may appear to be a needlessly precise measurement of the number three (figure 1a), it today serves as by far the best high-energy constraint on the unitarity of the neutrino mixing matrix. LEP’s stash of a million clean tau pairs from Z → τ+ τ– decays also allowed the universality of the lepton–neutrino couplings to the weak charged current to be tested with unprecedented precision. The present averages are still dominated by the LEP numbers: gτ/gμ = 1.0010 ± 0.0015 and gτ/ge = 1.0029 ± 0.0015.
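
The counting logic can be made concrete with the textbook lineshape relations, sketched here at lowest order (QED and other radiative corrections are neglected, and lepton universality is assumed; Γ_had, Γ_ℓℓ and Γ_νν denote the hadronic, charged-lepton and single-neutrino partial widths of the Z). The peak hadronic cross section and the total width fix the visible partial widths, and whatever is left over is attributed to Nν light neutrino species:

```latex
% Peak hadronic cross section and total width of the Z (lowest order)
\sigma^{0}_{\mathrm{had}} \;=\; \frac{12\pi}{m_Z^{2}}\,
  \frac{\Gamma_{ee}\,\Gamma_{\mathrm{had}}}{\Gamma_Z^{2}},
\qquad
\Gamma_Z \;=\; \Gamma_{\mathrm{had}} + 3\,\Gamma_{\ell\ell} + N_\nu\,\Gamma_{\nu\nu}

% Invisible width and the number of light neutrino species
\Gamma_{\mathrm{inv}} \;=\; \Gamma_Z - \Gamma_{\mathrm{had}} - 3\,\Gamma_{\ell\ell},
\qquad
N_\nu \;=\; \frac{\Gamma_{\mathrm{inv}}}{\Gamma_{\nu\nu}^{\mathrm{SM}}}
```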

Diagrams showing measurements at LEP

LEP continued to carry out Z-lineshape scans until 1991, and repeated them in 1993 and 1995. Two thirds of the total luminosity was recorded at the Z pole. As statistical uncertainties on the Z’s parameters went down, the experiments were challenged to control systematic uncertainties, especially in the experimental acceptance and luminosity. Monte Carlo modelling of fragmentation and hadronisation was gradually improved by tuning to measurements in data. On the luminosity front it soon became clear that dedicated monitors would be needed to measure small-angle Bhabha scattering (e+e– → e+e–), which proceeds at a much higher rate than Z production. The trick was to design a compact electromagnetic calorimeter with sufficient position resolution to define the geometric acceptance, and to compare this to calculations of the Bhabha cross section.
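
Why the acceptance definition is so critical can be seen from a leading-order sketch of the accepted rate (t-channel photon exchange only, valid for small scattering angles; θ_min and θ_max stand for the inner and outer edges of the luminosity calorimeter’s angular acceptance). The accepted cross section scales as 1/θ²_min, so even a tiny bias on the inner acceptance edge feeds directly into the luminosity, and hence into every absolute cross section:

```latex
% Accepted small-angle Bhabha cross section (leading order, t-channel dominated)
\sigma_{\mathrm{acc}} \;\simeq\; \frac{16\pi\alpha^{2}}{s}
  \left(\frac{1}{\theta_{\min}^{2}} - \frac{1}{\theta_{\max}^{2}}\right),
\qquad
\mathcal{L} \;=\; \frac{N_{\mathrm{Bhabha}}}{\sigma_{\mathrm{acc}}}
```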

The final ingredient for LEP’s extraordinary precision was a detailed knowledge of the beam energy, which required the four experiments to work closely with accelerator experts. Curiously, the first energy calibration was performed in 1990 by circulating protons in the LEP ring – the first protons to orbit in what would eventually become the LHC tunnel, but at a meagre energy of 20 GeV. The speed of the protons was inferred by comparing the radio-frequency electric field needed to keep protons and electrons circulating at 20 GeV on the same orbit, allowing a measurement of the total magnetic bending field on which the beam energy depends. This gave a 20 MeV uncertainty on the Z mass. To reduce this to 1.7 MeV for the final Z-pole measurement, however, required the use of resonant depolarisation routinely during data taking. First achieved in 1991, this technique uses the natural transverse spin polarisation of the beams to yield an instantaneous measurement of the beam energy to a precision of ±0.1 MeV – so precise that it revealed minute effects caused, for example, by Earth’s tides and the passage of local trains (see Tidal forces, melting ice and the TGV to Paris). The final precision was more than 10 times better than had been anticipated in pre-LEP studies.
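
The precision of resonant depolarisation rests on the fact that the number of spin precessions per turn (the spin tune) depends only on the beam energy and well-known constants. As a sketch of the standard relation, with a_e ≈ 1.15965 × 10⁻³ the electron anomalous magnetic moment:

```latex
% Spin tune of a stored electron beam and the resulting energy calibration
\nu_{\mathrm{spin}} \;=\; a_e\,\gamma \;=\; \frac{a_e\,E_{\mathrm{beam}}}{m_e c^{2}}
\quad\Longrightarrow\quad
E_{\mathrm{beam}}\,[\mathrm{GeV}] \;\simeq\; 0.4406486\;\nu_{\mathrm{spin}}
```

Measuring the frequency at which the beam depolarises therefore translates directly into the beam energy, which is what made the ±0.1 MeV resolution possible.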

Electroweak working group

The LEP electroweak working group saw the ALEPH, DELPHI, L3 and OPAL collaborations work closely on combined cross-section and other key measurements – in particular the forward-backward asymmetry in lepton and b-quark production – at each energy point. By 1994, results from the SLD collaboration at SLAC were also included. Detailed negotiations were sometimes needed to agree on a common treatment of statistical correlations and systematic uncertainties, setting a precedent for future inter-experiment cooperation. Many tests of the SM were performed, including tests of lepton universality (figure 1b), adding to the tau lepton results already mentioned. Analyses also demonstrated that the couplings of leptons and quarks are consistent with the SM predictions.

The combined electroweak measurements were used to make stunning predictions of the top-quark and Higgs-boson masses, mt and mH. After the 1993 Z-pole scan, the LEP experiments were able to produce a combined measurement of the Z width with a precision of 3 MeV in time for the 1994 winter conferences, allowing the prediction mt = 177 ± 13 ± 19 GeV, where the first error is experimental and the second is due to mH not being known. A month later the CDF collaboration at the Tevatron announced the possible existence of a top quark with a mass of 176 ± 16 GeV. Both CDF and its companion experiment D0 reached 5σ “discovery” significance a year later. It is a measure of the complexity of the Z-boson analyses (in particular the beam-energy measurement) that the final Z-pole results were published a full 11 years later, constraining the Higgs mass to be less than 285 GeV at 95% confidence level (figure 1c), with a best fit at 129 GeV.

From QCD to the W boson

LEP’s fame in the field tends to concern its electroweak breakthroughs. But, with several million recorded hadronic Z decays, the LEP experiments also made big advances in quantum chromodynamics (QCD). These results significantly increased knowledge of hadron production and quark and gluon dynamics, and drove theoretical and experimental methods that are still used extensively today. LEP’s advantage as a lepton collider was to have an initial state that was independent of nucleon structure functions, allowing the measurement of a single, energy-scale-dependent coupling constant. The strong coupling constant αs was determined to be 0.1195 ± 0.0034 at the Z pole, and to vary with energy – the highlight of LEP’s QCD measurements. This so-called running of αs was verified over a large energy range, from the tau mass up to 206 GeV, yielding additional experimental confirmation of QCD’s core property of asymptotic freedom (figure 2a).
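
To illustrate what the running means numerically, the short sketch below evaluates the one-loop renormalisation-group expression with a fixed number of active flavours (n_f = 5) and the LEP value αs(mZ) = 0.1195 quoted above. It is a simplified toy rather than the multi-loop evolution used in the actual analyses, but it reproduces the logarithmic weakening of the coupling with energy that LEP verified:

```python
import math

def alpha_s(q, alpha_s_mz=0.1195, m_z=91.1876, n_f=5):
    """One-loop running of the strong coupling with a fixed number of
    active flavours -- an illustrative toy, not the full QCD evolution."""
    b0 = (33 - 2 * n_f) / (12 * math.pi)
    return alpha_s_mz / (1 + b0 * alpha_s_mz * math.log(q**2 / m_z**2))

# The coupling weakens logarithmically with energy (asymptotic freedom)
for q in (10.0, 91.1876, 206.0):
    print(f"alpha_s({q:6.1f} GeV) = {alpha_s(q):.4f}")
```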

Diagrams showing LEP results

Many other important QCD measurements were performed, such as the gluon self-coupling, studies of differences between quark and gluon jets, verification of the running b-quark mass, studies of hadronisation models, measurements of Bose–Einstein correlations and detailed studies of hadronic systems in two-photon scattering processes. The full set of measurements established QCD as a consistent theory that accurately describes the phenomenology of the strong interaction.

Following successful Z operations during the “LEP1” phase in 1989–1995, a second LEP era devoted to accurate studies of W-boson pair production at centre-of-mass energies above 160 GeV got under way. Away from the Z resonance, the electron-positron annihilation cross section decreases sharply; as soon as the centre-of-mass energy reaches twice the W and Z boson masses, the WW, then ZZ, production diagrams open up (figure 2b). Accessing the WW threshold required the development of superconducting radio-frequency cavities, the first of which were already installed in 1994, and they enabled a gradual increase in the centre-of-mass energy up to a maximum of 209 GeV in 2000.
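
The need for superconducting RF follows directly from synchrotron radiation, whose energy loss per turn grows with the fourth power of the beam energy. A rough estimate using the standard formula for electrons, and taking a bending radius of roughly 3 km for LEP (a round value assumed here for illustration), shows why copper cavities alone could not sustain LEP2 energies:

```latex
% Synchrotron-radiation energy loss per turn for an electron or positron
U_{0}\,[\mathrm{keV}] \;\simeq\; 88.5\,\frac{E^{4}\,[\mathrm{GeV}^{4}]}{\rho\,[\mathrm{m}]}
\quad\Rightarrow\quad
U_{0} \;\approx\; 88.5 \times \frac{(104.5)^{4}}{3000}\ \mathrm{keV} \;\approx\; 3.5\ \mathrm{GeV\ per\ turn}
```

At the Z pole the same estimate gives only of order 0.1 GeV per turn, which is why room-temperature cavities sufficed for LEP1.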

The “LEP2” phase allowed the experiments to perform a signature analysis that dated back to the first conception of the machine: the measurement of the WW-boson cross section. Would it diverge or would electroweak diagrams interfere to suppress it? The precise measurement of the WW cross section as a function of the centre-of-mass energy was a very important test of the SM since it showed that the sum and interference of three four-fermion processes were indeed acting in WW production: the t-channel ν exchange, and the s-channel γ and Z exchange (figure 2c). LEP data proved that the γWW and ZWW triple gauge vertices are indeed present and interfere destructively with the t-channel diagram, suppressing the cross section and stopping it from diverging.

The second key LEP2 electroweak measurement was of the mass and total decay width of the W boson, which were determined by directly reconstructing the decay products of the two W bosons in the fully hadronic (W+W– → qqqq) and semi-leptonic (W+W– → qqℓν) decay channels. The combined LEP W-mass measurement from direct reconstruction data alone is 80.375 ± 0.025(stat) ± 0.022(syst) GeV, with the largest contribution to the systematic uncertainty originating from fragmentation and hadronisation. The relation between the Z-pole observables, mt and mW, provides a stringent test of the SM and constrains the Higgs mass.
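
For reference, adding the two quoted uncertainty components in quadrature (treating them, as a simplification, as uncorrelated) gives the total uncertainty on the LEP W mass:

```latex
\Delta m_W \;\simeq\; \sqrt{(0.025)^{2} + (0.022)^{2}}\ \mathrm{GeV} \;\approx\; 0.033\ \mathrm{GeV}
```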

To the Higgs and beyond

Before LEP started, the mass of the Higgs boson was basically unknown. In the simplest version of the SM, involving a single Higgs boson, the only robust constraints were its non-observation in nuclear decays (forbidding masses below 14 MeV) and the need to maintain a sensible, calculable theory (ruling out masses above 1 TeV). In 1990, soon after the first LEP data-taking period, the full Higgs-boson mass range below 24 GeV was excluded at 95% confidence level by the LEP experiments. Above this mass the main decay of the Higgs boson, occurring 80% of the time, was predicted to be into b quark–antiquark pairs, followed by pairs of tau leptons, charm quarks or gluons, while the WW* decay mode starts to contribute at the maximum reachable masses of approximately 115 GeV. The main production process is Higgs-strahlung, whereby a Higgs is emitted by a virtual Z boson.
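
A back-of-the-envelope estimate makes the reach explicit: in Higgs-strahlung with both bosons treated as on shell (only an approximation near threshold), the heaviest Higgs that can be produced is

```latex
m_H^{\max} \;\simeq\; \sqrt{s} - m_Z \;\approx\; 209 - 91\ \mathrm{GeV} \;\approx\; 118\ \mathrm{GeV}
```

In practice the steeply falling cross section and the limited luminosity near this kinematic boundary brought the real sensitivity down to the roughly 115 GeV quoted above.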

During the full lifetime of LEP, the four experiments kept searching for neutral and charged Higgs bosons in several models and exclusion limits continued to improve. In its last year of data taking, when the centre-of-mass energy reached 209 GeV, ALEPH reported an excess of four-jet events. It was consistent with a 114 GeV Higgs boson and had a significance that varied as the data were accumulated, peaking at an instantaneous significance of around 3.9 standard deviations. The other three experiments carefully scrutinised their data to confirm or disprove ALEPH’s suggestion, but none observed any long-lasting excess in that mass region. Following many discussions, the LEP run was extended until 8 November 2000. However, it was decided not to keep running the following year so as not to impact the LHC schedule. The final LEP-wide combination excluded, at 95% confidence level, a SM Higgs boson with mass below 114.4 GeV.

The four LEP experiments carried out many other searches for novel physics that set limits on the existence of new particles. Notable cases are the searches for additional Higgs bosons in two-Higgs-doublet models and their minimal supersymmetric incarnation. Neutral scalar and pseudoscalar Higgs bosons lighter than the Z boson and charged Higgs bosons up to the kinematic limit of their pair production were also excluded. Supersymmetric particles suffered a similar fate, under the theoretically attractive assumption of R-parity conservation. The existence of sleptons and charginos was excluded over most of the parameter space for masses below 70–100 GeV, near the kinematic limit for their pair production. Neutralinos with masses below approximately half the Z-boson mass were also excluded in a large part of the parameter space. The LEP exclusions for several of these electroweak-produced supersymmetric particles are still the most stringent and most model-independent limits ever obtained.

It is very hard to remember how little we knew before LEP and the giant step that LEP made. It was often said that LEP discovered electroweak radiative corrections at the level of 5σ, opening up a precision era in particle physics that continues to set the standard today and offer guidance on the elusive new physics beyond the SM.

The greatest lepton collider https://cerncourier.com/a/the-greatest-lepton-collider/ Wed, 11 Sep 2019 10:36:48 +0000 https://preview-courier.web.cern.ch/?p=84286 LEP was the highest energy e+e– collider ever built, with levels of precision that remain unsurpassed in accelerator physics. Former CERN director of accelerators Steve Myers tells LEP’s story from conception to its emotional final day.

A quadrupole next to one of the long dipole magnets in LEP

A few minutes before midnight on a summer’s evening in July 1989, 30 or so people were crammed into a back room at CERN’s Prévessin site in the French countryside. After years of painstaking design and construction, we were charged with breathing life into the largest particle accelerator ever built. The ring was complete, the aperture finally clear and the positron beam made a full turn on our first attempt. Minutes later beams were circulating, and a month later the first Z boson event was observed. Here began a remarkable journey that firmly established the still indefatigable Standard Model of particle physics.

So, what can go wrong when you’re operating 27 kilometres of particle accelerator, with ultra-relativistic leptons whizzing around the ring 11,250 times a second? The list is long. The LEP ring was packed with magnets, power converters, a vacuum system, a cryogenics system, a cooling and ventilation system, beam instrumentation – and much more. Then there was the control system, fibres, networks, routers, gateways, software, databases, separators, kickers, beam dump, radio-frequency (RF) cavities, klystrons, high-voltage systems, interlocks, synchronisation, timing, feedback… And, of course, the experiments, the experimenters and everybody’s ability to get along in a high-pressure environment.
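
The quoted revolution rate is simply the speed of light divided by the ring circumference, since the leptons are ultra-relativistic (v ≈ c):

```latex
f_{\mathrm{rev}} \;=\; \frac{c}{C} \;\approx\; \frac{2.998\times 10^{8}\ \mathrm{m/s}}{26\,659\ \mathrm{m}} \;\approx\; 1.12\times 10^{4}\ \mathrm{turns\ per\ second}
```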

LEP wasn’t the only game in town. There was fierce competition from the more innovative Stanford Linear Collider (SLC) in California. But LEP was off to a fantastic start and its luminosity increase was much faster than at its relatively untested linear counterpart. A short article capturing the transatlantic rivalry appeared in the Economist on 19 August 1989. “The results from California are impressive,” the magazine reported, “especially as they come from a new and unique type of machine. They may provide a sure answer to the generation problem before LEP does. This explains the haste with which the finishing touches have been applied to LEP. The 27 km-long device, six years in the making, was transformed from inert hardware to working machine in just four weeks – a prodigious feat, unthinkable anywhere but at CERN. Even so, it was still not as quick as Carlo Rubbia, CERN’s domineering director-general might have liked.”

Notes from the underground

LEP’s design dates from the late 1970s, the project being led by accelerator-theory group leader Eberhard Keil, RF group leader Wolfgang Schnell and C J “Kees” Zilverschoon. The first decision to be made was the circumference of the tunnel, with four options on the table: a 30 km ring that went deep into the Jura mountains, a 22 km ring that avoided them entirely, and two variants with a length of 26.7 km that grazed the outskirts of the mountains. The then director-general, Herwig Schopper, decided on a circumference of 26.7 km with an eye on a future proton collider for which it would be “decisive to have as large a tunnel as possible” (CERN Courier July/August 2019 p39). The final design was approved on 30 October 1981 with Emilio Picasso leading the project. Construction of the tunnel started in 1983, after a standard public enquiry in France.

Blasting the LEP tunnel under the Jura mountains

LEP’s tunnel, the longest-ever attempted prior to the Channel Tunnel, which links France and Britain, was carved by three tunnel-boring machines. Disaster struck just two kilometres into the three-kilometre stretch of tunnel in the foothills of the Jura, where the rock had to be blasted because it was not suitable for boring. Water burst in and formed an underground river that took six months to eliminate (figure 1). By June 1987, however, part of the tunnel was complete and ready for the accelerator to be installed.

Just five months after the difficult excavation under the Jura, one eighth of the accelerator (octant 8) had been completely installed, and, a few minutes before midnight on 12 July 1988, four bunches of positrons made the first successful journey from the town of Meyrin in Switzerland (point 1) to the village of Sergy in France (point 2), a distance of 2.5 km. Crucially, the “octant test” revealed a significant betatron coupling between the transverse planes: a thin magnetised nickel layer inside the vacuum chambers was causing interference between the horizontal and vertical focusing of the beams. The quadrupole magnets were adjusted to prevent a resonant reinforcement of the effect each turn, and the nickel was eventually demagnetised.

Giving birth to LEP

The following months saw a huge effort to install equipment in the remaining 24 km of the tunnel – magnets, vacuum chambers and RF cavities, as well as beam instrumentation, injection equipment, electrostatic separators, electrical cabling, water cooling, ventilation and all the rest. This was followed by conditioning the cavities, baking out and leak-testing the vacuum chambers, and individual testing of each system. At the same time, a great deal of effort went into preparing the software needed to operate the collider, with limited resources.

In the late 1980s, control systems for accelerators were going through a major transition to the PC. LEP was caught up in the mess and there were many differences of opinion on how to design LEP’s control system. As July 1989 approached, the control system was not ready and a small team was recruited to implement the bare minimum controls required to inject beam and ramp up the energy. Unable to hone key parameters such as the tune and orbit corrections before beam was injected, we had two major concerns: is the beam aperture clear of all obstacles, and are there any polarity errors in the connections of the many thousands of magnetic elements? So we nominated a “Mr Polarity”, whose job was to check all polarities in the ring. This may sound trivial, but with thousands of connections it was a huge task.

Tidal forces, melting ice and the TGV to Paris

Diagrams showing variations in LEP’s beam-energy resolution

LEP’s beam-energy resolution was so precise that it was possible to observe distortion of the 27 km ring by a single millimetre, whether due to the tidal forces of the Sun and Moon, or the seasonal distortion caused by rain and meltwater from the nearby mountains filling up Lac Léman and weighing down one side of the ring. In 1993 we noticed even more peculiar random variations on the energy signal during the day – with the exception of a few hours in the middle of the night when the signal was noise free. Everybody had their own pet theory. I believed it was some sort of effect coming from planes interacting with the electrical supply cables. Some nights later I could be seen sitting in a car park on the Jura at 2 a.m., trying to prove my theory with visual observations, but it was very dark and all the planes had stopped landing several hours beforehand. Experiment inconclusive! The real culprit, the TGV (a high-speed train), was discovered by accident a few weeks later during a discussion with a railway engineer: leakage currents on the French rail track flowed through the LEP vacuum chamber with the return path via the Versoix river back to Cornavin. The noise hadn’t been evident when we first measured the beam energy as TGV workers had been on strike.
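
The sensitivity to millimetre-scale distortions follows from the fact that the RF system fixes the length of the orbit: if the tunnel circumference changes by ΔC, the beam is pushed off-centre in the quadrupoles and its energy shifts by an amount amplified by the inverse of the momentum compaction factor α_c. A rough sketch, taking α_c of order 2 × 10⁻⁴ as a representative value assumed here for a LEP-like optics:

```latex
\frac{\Delta E}{E} \;\simeq\; -\frac{1}{\alpha_c}\,\frac{\Delta C}{C}
\;\approx\; -\frac{1}{2\times 10^{-4}}\times\frac{1\ \mathrm{mm}}{26.7\ \mathrm{km}}
\;\approx\; -2\times 10^{-4}
```

At a 45.6 GeV beam energy this is of order 10 MeV, easily visible with the precision of LEP’s energy calibration.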

At a quarter to midnight on 14 July 1989, the aperture was free of obstacles and the beam made its first turn on our first attempt. Soon afterwards we managed to achieve a circulating beam, and we were ready to fine tune the multitude of parameters needed to prepare the beams for physics.

The goal for the first phase of LEP was electron–positron collisions at a total energy of 91 GeV – the mass of the neutral carrier of the weak force, the Z boson. LEP was to be a true Z factory, delivering millions of Zs for precision tests of the Standard Model. To mass-produce them required beams not only of high energy but also of high intensity, and delivering them required four steps. The first was to accumulate the highest possible beam current at 20 GeV – the injection energy. This was a major operation in itself, involving LEP’s purpose-built injection linac and electron–positron accumulator, the Proton Synchrotron, the Super Proton Synchrotron (SPS) and, finally, transfer lines to inject electrons and positrons in opposite directions – these curved not only horizontally but also vertically as LEP and the SPS were at different heights. The second step was to ramp up the accumulated current to the energy of the Z resonance with minimal losses. Thirdly, the beam had to be “squeezed” to improve the collision rate at the interaction regions by changing the focusing of the quadrupoles on either side of the experiments, thereby reducing the transverse cross section of the beam at the collision points. The fourth and final step was to bring the two beams into collision at the centre of the experiments.

Following the highly successful first turn on 14 July 1989, we spent the next month preparing for the first physics run. Exactly a month later, on 13 August, the beams collided for the first time. The following 10 minutes seemed like an eternity since none of the four experiments – ALEPH, DELPHI, L3 and OPAL – reported any events. I was in the control room with Emilio Picasso and we were beginning to doubt that the beams were actually colliding when Aldo Michelini called from OPAL with the long-awaited comment: “We have the first Z0!” ALEPH and OPAL physicists had connected the Z signal to a bell that sounded on the arrival of the particle in their detectors. While OPAL’s bell rang proudly, ALEPH’s was silent, leading to a barrage of complaints before it became apparent that they were waiting for the collimators to close before turning on their subdetectors. As the luminosity rose during the subsequent period of machine studies the bells became extremely annoying and were switched off.

From the Z pole to the WW threshold

Physicists in front of the final superconducting RF-cavity module to be installed

The first physics run began on 20 September 1989, with LEP’s total energy tuned for five days to the Z mass peak at 91 GeV, providing enough integrated luminosity to generate 1400 Zs in each experiment. A second period followed, this time with the energy scanned through the width of the Z at five different beam energies: at the peak and ±1 GeV and ±2 GeV to either side, allowing the experiments to measure the width of the Z resonance. First physics results were announced on 13 October, just three months after the final testing of the accelerator’s components (see LEP’s electroweak leap).

LEP dwelt at the Z peak from 1989 to 1995, during which time the four experiments each observed approximately 4.5 million Z decays. In 1995 a major upgrade dubbed LEP2 saw the installation of 288 superconducting cavities (figure 2), enabling LEP to sit at or near the WW threshold of 161 GeV for the following five years. The maximum beam energy reached was 104.4 GeV. There was also a continuous effort to increase the luminosity by increasing the number of bunches, reducing the emittance by adjusting the focusing, and squeezing the bunches more tightly at the interaction points, with LEP’s performance ultimately limited by the nonlinear forces of the beam–beam interaction – the perturbations of the beams as they cross the opposing beam. LEP surpassed every one of its design parameters (figure 3).

Life as a LEP accelerator physicist

Being an accelerator physicist at LEP took heart as well as brains. The Sisyphean daily task of coaxing the seemingly temperamental machine to optimal performance even led us to develop an emotional attachment to it. Challenges were unpredictable, such as for the engineers dispatched on a fact-finding mission to ascertain the cause of an electrical short circuit, only to discover two deer, “Romeo and Juliet”, locked in a lovers’ embrace having bitten through a cable, or the discovery of sabotage with beer bottles (see The bizarre episode of the bottles in the beampipe). The aim, however, was clear: inject as much current as possible into both beams, ramp the energy up to 45 GeV, squeeze the beam size down at the collision points, collide and then spend a few hours delivering events to the experiments. The reality was hours of furious concentration, optimisation, and, in the early days, frustrating disappointment.

Diagram showing LEP’s integrated luminosity

In the early years, filling LEP was a delicate hour-long process of parameter adjustment, tweaking and coaxing the beam into the machine. On a good day we would see the beam wobble alarmingly on the UV telescopes, lose a bit and watch the rest struggle up the ramp. On a bad day, futile attempt after futile attempt, most of the beam would disappear without warning in the first few seconds of the ramp. The loss would play out over a few minutes and there was nothing you could do: we would stand there, watching the lifetime buck and dip, as the painstakingly injected beam slowly or quickly drifted out of the machine. The price of failure was a turnaround and refill. Success brought the opportunity to chance the squeeze – an equally hazardous manoeuvre whereby the interaction-point focusing magnets were adjusted to reduce the beam size – and then perhaps a physics fill, and a period of relative calm. At this stage the focus would move to the experimental particle physicists on shift at the four experiments. Each had their own particular collective character, and their own way of dealing with us. We veered between being accommodating, belligerent, maverick, dedicated, professional and very occasionally hopelessly amateur – sometimes all within the span of a single shift, depending on the attendant pressures.

Table showing LEP

The experiment teams paraded their operational efficiency numbers – plus complaints or congratulations – at twice-weekly scheduling meetings. Well run and disciplined, ALEPH almost always had the highest efficiency figures; their appearances at scheduling meetings were nearly always a simple statement of 97.8% or thereabouts. This was enlivened in later years by the repeated appearance of their coordinator Bolek Pietrzyk, who congratulated us each time we stepped up in energy or luminosity with a strong, Polish-accented, “Congratulations! You have achieved the highest energy electron–positron collisions in the universe!”, which was always gratifying. Equally professional, but more relaxed, was OPAL, which had a strong British and German contingent. These guys understood human nature. Quite simply, they bribed us. Every time we passed a luminosity target or hit a new energy record they’d turn up in the control room with champagne or crates of German beer. Naturally we’d do anything for them, happily moving heaven and earth to resolve their problems. L3 and DELPHI had their own quirks. DELPHI, for example, ran their detector as a “state machine”, whose status changed automatically based on signals from the accelerator control room. All well and good, but they depended on us to change the mode to “dump beam” at the end of a fill, something that was occasionally skipped, leaving DELPHI’s subdetectors on and them ringing us desperately for a mode change, while baffled DELPHI students on shift would ask what was going on. Filling and ramping were demanding periods during the operational sequence and a lot of concentration was required. The experiment teams did well not to ring and make too many demands at this stage – requests were occasionally rebuffed with a brusque response.

On the verge of a great discovery?

LEP’s days were never fated to simply dwindle away. Early on, CERN had a plan to install the LHC in the same tunnel, in a bid to reach ever higher energies and be the first to discover the Higgs boson. However, on 14 June 2000, during LEP’s final year of scheduled running, the ALEPH experiment reported a possible Higgs event during operations at a centre-of-mass energy of 206.7 GeV. It was consistent with “Higgs-strahlung”, whereby a Z radiates a Higgs boson, which was expected to dominate Higgs-boson production in e+e– collisions at LEP2 energies. On 31 July and 21 August ALEPH reported second and third events corresponding to a putative reconstructed Higgs mass in the range 114–115 GeV.

The bizarre episode of the bottles in the beampipe

The bottles in the beampipe

The story of the sabotage of LEP has grown in the retelling, but I was there in June 1996, hurrying back early from a conference to help the machine operators, who had been struggling to circulate a beam for several days. After exhausting other possibilities, it became clear that there was an obstruction in the vacuum pipe, and we located it using the beam-position system. It appeared to be around point 1 (where ATLAS now sits), so we opened the vacuum seal and took a look inside the beampipe using mirrors and endoscopes. Not seeing anything, I frustratedly squeezed my head between the vacuum flanges and peered down inside the pipe. In the distance was something resembling a green concave lens. “This looks like the bottom of a beer bottle,” I thought, restraining myself from uttering a word to anyone in the vicinity. I went to the opposite open end of the vacuum section and peered into the vacuum pipe again: a green circular disk this time, but again, not a word. Someone got a long pole to poke out the offending article – out it came, and my guess was correct: it was a Heineken beer bottle, which had indeed refreshed the parts no other beer could reach, as the slogan ran. A hasty search revealed a second bottle. Upon closer inspection it was clear that the control-room operators had very nearly succeeded in making the beam circulate despite the obstacles: a scorch burn along the label showed that they had almost managed to steer the beam past the bottles. If there had only been one they may have succeeded. The Swiss police interviewed me concerning this act of sabotage, but the culprit was never unmasked.

LEP was scheduled to stop in mid-September with two weeks of reserve time granted to the LEP experiments to see if new Higgs-like events would appear. After the reserve weeks, ALEPH requested two months more running to double its integrated luminosity. One was granted, yielding a 50% increase in the accumulated data, and ALEPH presented an update of their results on 10 October: the signal excess had increased to 2.6σ. Things were really heating up, and on 16 October L3 announced a missing-energy candidate. By now the accelerator team was pushing LEP to its limits, to squeeze out every ounce of physics data in the service of the experiments’ search for the elusive Higgs. At the LEP committee meeting on 3 November, ALEPH presented new data that confirmed their excess once again – it had now grown to 2.9σ. A request to extend LEP running by one year was made to the LEPC. There was gridlock, and no unanimous recommendation could be made.

All of CERN was discussing the proposed running of LEP in 2001 to obtain final evidence for a possible discovery of the Higgs boson. Arguments against included delays to the start of the LHC of up to three years. There was also concern that Fermilab’s Tevatron would beat the LHC to the discovery of the Higgs, as well as mundane but practical arguments about the transfer of human resources to the LHC and the impact on the materials budget, including electricity costs. The impending closure of LEP, when many of us thought we were about to discover the Higgs, was perceived as the death of a dear friend by most of the LEP-ers. After each of the public debates on the subject a group of us would meet in some local pub, drink a few beers, curse the disbelievers and cry on each other’s shoulders. This was the only “civil war” that I saw in my 43 years at CERN.

LEP’s final moments before being decommissioned and replaced by the LHC

The CERN research board met again on 7 November and again there was deadlock, with the vote split eight votes to eight. The next day, then Director-General Luciano Maiani announced that LEP had closed for the last time. It was a deeply unpopular decision, but history has shown it to be correct: the Higgs was discovered at the LHC 12 years later, with a mass of not 115 but 125 GeV. LEP’s closure allowed a massive redeployment of skilled staff, and the experience gained in running, for the first time, so large an accelerator went on to prove essential to the safe and efficient operation of the LHC.

When LEP was finally laid to rest we met one last time for an official wake (figure 4). After the machine was dismantled, requiring the removal to the surface of around 30,000 tonnes of material, some of the magnets and RF units were shipped to other labs for use in new projects. Today, LEP’s concrete magnet casings can still be seen scattered around CERN as shielding units for antimatter and fixed-target experiments, and even as road barriers.

LEP was the highest-energy e+e– collider ever built. Its legacy was and is extremely important for present and future colliders. In luminosity, energy and energy calibration, the quality and precision of the physics data remain unsurpassed, and LEP is the reference for any future e+e– ring-collider design.

Feature LEP was the highest energy e+e– collider ever built, with levels of precision that remain unsurpassed in accelerator physics. Former CERN director of accelerators Steve Myers tells LEP’s story from conception to its emotional final day. https://cerncourier.com/wp-content/uploads/2019/09/CCSepOct19_legacy_frontis.jpg
CERN and the Higgs Boson https://cerncourier.com/a/cern-and-the-higgs-boson/ Mon, 09 Sep 2019 12:33:28 +0000 https://preview-courier.web.cern.ch/?p=84369 James Gillies’ slim volume conveys the excitement of the hunt for the Higgs.

CERN and the Higgs Boson, by James Gillies

James Gillies’ slim volume CERN and the Higgs Boson conveys the sheer excitement of the hunt for the eponymous particle. It is a hunt that had its origins at the beginning of the last century, with the discovery of the electron, quantum mechanics and relativity, and which was only completed in the first decades of the next. It is also a hunt throughout which CERN’s science, technology and culture grew in importance. Gillies has produced a lively and enthusiastic text that explores the historical, theoretical, experimental, technical and political aspects of the search for the Higgs boson without going into oppressive scientific detail. It is rare that one comes across a monograph as good as this.

Gillies draws attention to the many interplays and dialectics that led to our present understanding of the Higgs boson. First of all, he brings to light the scientific issues associated with the basic constituents of matter, and the forces and interactions that give rise to the Standard Model. Secondly, he highlights the symbiotic relationship between theoretical and experimental research, each leading the other in turn, and taking the subject forward. Finally, he shows the inter-development of the accelerators, detectors and experimental methods to which massive computing power had eventually to be added. This is all coloured by a liberal sprinkling of anecdotes about the people that made it all possible.

Complementing this is the story of CERN, both as a laboratory and as an institution, traced over the past 60 years or so, through to its current pre-eminent standing. Throughout the book the reader learns just how important the people involved really are to the enterprise: their sheer pleasure, their commitment through the inevitable ups and downs, and their ability to collaborate and compete in the best of ways.

A ripping yarn, then, which it might seem churlish to criticise. But then again, that is the job of a reviewer. There is, perhaps, an excessively glossy presentation of progress, and the exposition continues forward apace without conveying the many downs of cutting-edge research: the technical difficulties and the many immensely difficult decisions that have to be made during such enormous endeavours. Doing science is great fun but also very difficult – but then what are challenges for?

There is, perhaps, an excessively glossy presentation of progress

A pertinent example in the Higgs-boson story not emphasised in the book occurred in 2000. The Large Electron Positron collider (LEP) was due to be closed down to make way for the LHC, but late in the year LEP’s ALEPH detector recorded evidence suggesting that a Higgs boson was being observed at a mass of 114–115 GeV – although, unfortunately, it was not seen by the other experiments (see p32). Exactly this situation had been envisaged when not one but four LEP experiments were approved in the 1980s. After considerable discussion LEP’s closure went ahead, much to the unhappiness and anger of a large group of scientists who believed they were on the verge of a great discovery. This made for a very difficult environment at CERN for a considerable time thereafter. We now know the Higgs was found at the LHC with a mass of 125 GeV, vindicating the original decision of 2000.

A few more pictures might help the text and fix the various contributors in readers’ minds, though clearly the book, part of a series of short volumes by Icon Books called Hot Science, is formatted for brevity. I also found the positioning of the important material on applications such as positron emission tomography and the world wide web to be unfortunate, situated as it is in the final chapter, entitled “What’s the use?” Perhaps instead the book could have ended on a more upbeat note by returning to the excitement of the science and technology, and the enthusiasm of the people who were inspired to make the discovery happen.

CERN and the Higgs Boson is a jolly good read and recommended to everyone. Whilst far from the first book on the Higgs boson, Gillies’ offering distinguishes itself with its concise history and the insider perspective available to him as CERN’s head of communications from 2003 to 2015 – the period that saw the denouement of the hunt for the Higgs.

Review James Gillies’ slim volume conveys the excitement of the hunt for the Higgs. https://cerncourier.com/wp-content/uploads/2019/09/higgsfield.jpg
Music of the muons https://cerncourier.com/a/music-of-the-muons/ Fri, 30 Aug 2019 13:22:10 +0000 https://preview-courier.web.cern.ch/?p=84156 Music and dance troupe Les Atomes Dansants use muon tracks from W, Z and Higgs events in CMS data to explore the links between science and art.

Subatomic Desire

Swiss composer Alexandre Traube and the Genevan video-performer Silvia Fabiani have collaborated to form music and dance troupe Les Atomes Dansants, with the aims of using CMS data to explore the links between science and art, and of establishing a dialogue between Eastern and Western culture. Premiering their show Subatomic Desire at CERN’s Globe of Science and Innovation on 21 June during Geneva’s annual Fête de la Musique, they took the act to the detector that served as their muse by performing in the hangar above the CMS experiment.

Muon tracks from W, Z and Higgs events served as inspiration for Traube, who was advised by CMS physicist Chiara Mariotti of INFN. He began by associating segments of CMS’s muon system with notes. Inspired by the detector’s arrangement as four nested dodecagons, he assigned a note from the chromatic scale to each of the 12 sides of the innermost layer, and the note a sonorous perfect fourth above to the corresponding segment in the outer layer. Developing his initial plan to also link the two intermediate layers of the muon system to specific frequencies, he associated two intermediate microtonal notes with the transverse momentum and rapidity of the tracks. At several moments during the performance the musicians improvise using the resulting four-note sequences: an expression of quantum indeterminacy, according to Traube. Fabiani’s video projections add to the surreal atmosphere by transposing the sequences into colours, with an animation of bullets referencing the Russian Second World War navy shells that were used to build CMS’s hadronic calorimeter.
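
For readers who like to see such a mapping written out, here is a minimal sketch of how a scheme of this kind could be coded. The pitch numbering, the normalisation of transverse momentum and rapidity, and the function name are illustrative assumptions for the purposes of the sketch, not Traube’s actual score.

def four_note_sequence(sector, pt, rapidity, pt_max=100.0, y_max=2.4):
    """Return four pitches (in semitones above a chosen tonic) for one muon track."""
    inner = sector % 12                      # chromatic note of the innermost dodecagon side
    outer = inner + 5                        # the perfect fourth above, for the outer layer
    # Two microtonal pitches in between, placed according to the track's
    # normalised transverse momentum and rapidity (an illustrative choice).
    frac_pt = min(abs(pt) / pt_max, 1.0)
    frac_y = min(abs(rapidity) / y_max, 1.0)
    return [inner, inner + 5 * frac_pt, inner + 5 * frac_y, outer]

print(four_note_sequence(sector=3, pt=42.0, rapidity=0.8))  # four pitches for an illustrative track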

Clad in lab coat, Einstein wig and reversed baseball cap, Doc MC Carré raps formulas and boogies around the stage

In concert with the audiovisual display, three performers sing about their love for the microcosm. Clad in lab coat, Einstein wig and reversed baseball cap, Doc MC Carré (David Charles) raps formulas and boogies around the stage. He is accompanied by Doc Lady Emmy, played by the soprano Marie-Najma Thomas, and Poète Atomique – the Persian singer Taghi Akhabari – who peppers the performance with mystical extracts from Sufi poets Rûmi and Attâr, and medieval German abbess Hildegard of Bingen, each of whom explores themes of the natural world in their writings. The performers contend that the lyrics speak about desire as the fuel for everything at the micro- and macroscale. Elaborate, contemporary and rich in metaphors, this is an experience that some will find abstruse but others will love.

Subatomic Desire will next be performed in Neuchâtel, Switzerland on 14 September.

Review Music and dance troupe Les Atomes Dansants use muon tracks from W, Z and Higgs events in CMS data to explore the links between science and art. https://cerncourier.com/wp-content/uploads/2019/08/Subatomic-desire-zoom-lowres.jpg
Lessons from LEP https://cerncourier.com/a/lessons-from-lep/ Thu, 11 Jul 2019 10:35:55 +0000 https://preview-courier.web.cern.ch?p=83648 The Large Electron Positron collider changed particle physics forever. As the field eyes up the next major collider, former CERN Director-General Herwig Schopper describes what it took to make LEP happen.


When was the first LEP proposal made, and by whom?

Discussions on how to organise a “world accelerator” took place at a pre-ICFA committee in New Orleans in the early 1970s. The talks went on for a long time, but nothing much came out of them. In 1978 John Adams and Léon Van Hove – the two CERN Director-Generals (DGs) at the time – agreed to build an electron–positron collider at CERN. There was worldwide support, but then there came competition from the US, worried that they might lose the edge in high-energy physics. Formal discussions about a Superconducting Supercollider (SSC) had already begun. While it was open to international contribution, Ronald Reagan’s “join it or not” approach to the SSC, and other reasons, put other countries off the project.

Was there scientific consensus for a collider four times bigger than anything before it?

Yes. The W and Z bosons hadn’t yet been discovered, but there were already strong indications that they were there. Measuring the electroweak bosons in detail was the guiding force for LEP. There was also the hunt for the Higgs and the top quark, yet there was no guidance on the masses of these particles. LEP was proposed in two phases, first to sit at the Z pole and then the WW threshold. We made the straight sections as long as possible so we could increase the energy during the LEP2 phase.

What about political consensus?

The first proposal for LEP was initially refused by the CERN Council because it had a 30 km circumference and cost 1.4 billion Swiss Francs. When I was appointed DG in February 1979, they asked me to sit down with both current DGs and make a common proposal, which we did. This was the proposal that reduced the circumference to 22 km. At that time CERN had a “basic” programme (which all Member States had to pay for) and a “special” programme whereby additional funds were sought. The latter was how the Intersecting Storage Rings (ISR) and the Super Proton Synchrotron (SPS) were built. But the cost of LEP made some Member States hesitate because they were worried that it would eat too much into the resources of CERN and national projects.

How was the situation resolved?

After long discussions, Council said: yes, you build it, but do so within a constant budget. It seemed like an impossible task because the CERN budget had peaked before I took over and it was already in decline. I was advised by some senior colleagues to resign because it was not possible to build LEP on a constant budget. So we found another idea: make advance payments and create debts. Council said we can’t make debts with a bank, so we raided the CERN pension fund instead. They agreed happily since I had to guarantee them 6% interest, and as soon as LEP was built we started to pay it back. With the LHC, CERN had to do the same (the only difference was that Council said we could go to a bank). CERN is still operating within essentially the same constant budget today (apart from compensation for inflation), with the number of users having more than doubled – a remarkable achievement! To get LEP approved, I also had to say to Council that CERN would fund the machine and others would fund the experiments. Before LEP, it was usual for CERN to pay for experiments. We also had to stop several activities like the ISR and the BEBC bubble chamber. So LEP changed CERN completely.

How do LEP’s findings compare with what was expected?

It was wonderful to see the W and Z discovered at the SPS while LEP was being built. Of course, we were disappointed that the Higgs and the top were not discovered. But, look, these things just weren’t known then. When I was at DESY, we spent 5 million Deutsche Marks to increase the radio-frequency power of the PETRA collider because theorists had guaranteed that the top quark would be lighter than 25 GeV! As for LEP2, it was completely unknown what it would find.

What is LEP’s physics legacy?

These days, there is a climate where everything that is not a peak is not a discovery. People often say “not much came out of LEP”. That is completely wrong. What people forget is that LEP changed high-energy physics from a 10% to a 1% science. Apart from establishing the existence of three neutrino flavours, the LEP experiments enabled predictions of the top-quark mass that were confirmed at Fermilab’s Tevatron. This is because LEP was measuring the radiative corrections – the essential element that shows the Standard Model is a renormalisable theory, as shown theoretically by ’t Hooft and Veltman. It also showed that the strong coupling constant, αs, runs with energy and allowed the coupling constants of all the gauge forces to be extrapolated to the Planck mass – where they do not meet. To my mind, this is the most concrete experimental evidence that the Standard Model doesn’t work, that there is something beyond it.
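
As a rough illustration of that last point, the one-loop running of the three gauge couplings can be extrapolated in a few lines. The sketch below uses standard rounded Z-pole inputs and the usual GUT-normalised hypercharge; the numbers are textbook values added here for orientation, not taken from the interview.

# One-loop running of the Standard Model gauge couplings, extrapolated
# from the Z pole to the Planck mass (no new physics assumed).
import math

m_Z, m_Planck = 91.19, 1.2e19              # GeV
inv_alpha = {1: 59.0, 2: 29.6, 3: 8.5}     # approximate 1/alpha_i at the Z pole
b = {1: 41/10, 2: -19/6, 3: -7}            # one-loop beta coefficients

t = math.log(m_Planck / m_Z)
for i in (1, 2, 3):
    # 1/alpha_i(mu) = 1/alpha_i(m_Z) - b_i/(2*pi) * ln(mu/m_Z)
    print(f"1/alpha_{i}(M_Planck) ~ {inv_alpha[i] - b[i] / (2 * math.pi) * t:.0f}")

The three values come out at roughly 33, 49 and 52: close to one another, but not equal, which is the sense in which the couplings “do not meet” without physics beyond the Standard Model.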

How did the idea come about to put a proton collider in the LEP tunnel?

When LEP was conceived, the Higgs was far in the future and nobody was really talking about it. When the LEP tunnel was discussed, the only consideration was the competition with the SSC. The question was: who would win the race to go to higher energy? It was clear in the long run that the proton machine would win, so we had the famous workshop in Lausanne in 1983 where we discussed the possibility of putting a proton collider in the LEP tunnel. It was foreseen then to put it on top of LEP and to have them running at the same time. With the LHC, we couldn’t compete in energy with the SSC so we went for higher luminosities. But when we looked into this, we realised we had to make the tunnel bigger. The original proposal, as approved by Council in October 1981, had a tunnel size of 22 km, and making it bigger was a big problem because of the geology – basically we couldn’t go too far under the Jura mountains. Nevertheless, I decided to go to 27 km against the advice of most colleagues and some advisory committees, a decision that delayed LEP by about a year because of the water in the tunnel. But it is almost forgotten that the LEP tunnel size was only chosen in view of the LHC.

Are there parallels with CERN today concerning what comes next after the LHC?

Yes and no. One of the big differences compared to the LEP days is that, back then, the population around CERN did not know what we were doing – the policy of management was not to explain what we were doing because it was “too complicated”! I was very surprised to learn this when I arrived as DG, so we had many hundreds of meetings with the local community. There was a misunderstanding about the word “nuclear” in CERN’s name – they thought we were involved in generating nuclear power. That fortunately has completely changed and CERN is accepted in the area.

What is different is the physics. We are in a situation more like the 1970s before the famous J/ψ discovery, when we had no indications from theory of where to go. People were talking about all sorts of completely new ideas back then. Whatever one builds now is penetrating into unknown territory. One cannot be sure we will find something because there are no predictions of any thresholds.

What wisdom can today’s decision-makers take from the LEP experience?

In the end I think that the next machine has to be a world facility. The strange thing is that CERN formally is still a European lab. There are associates and countries who contribute in kind, which allows them to participate, but the boring things like staff salaries and electricity have to be paid for by the Member States. One therefore has to find out whether the next collider can be built under a constant budget or whether one has to change the constitutional model of CERN. In the end I think the next collider has to be a proton machine. Maybe the LEP approach of beginning with an electron–positron collider in a new tunnel would work. I wouldn’t exclude it. I don’t believe that an electron–positron linear collider would satisfy requests for a world machine as its energy will be lower than for a proton collider, and because it has just one interaction point. Whatever the next project is, it should be based on new technology such as higher field superconducting magnets, and not be just a bigger version of the LHC. Costs have gone up and I think the next collider will not fly without new technologies.

You were born before the Schrödinger equation and retired when LEP switched on in 1989. What have been the highs and lows of your remarkable career?

I was lucky in my career to be able to go through the whole of physics. My PhD was in optics and solid-state physics, then I later moved to nuclear and particle physics. So I’ve had this fantastic privilege. I still believe in the unity of physics in spite of all the specialisation that exists today. I am glad to have seen all of the highlights. Witnessing the discovery of parity violation while I was working in nuclear physics was one.

How do you see the future of curiosity-driven research, and of CERN?

The future of high-energy physics is to combine with astrophysics, because the real big questions now are things like dark matter and dark energy. This has already been done in a sense. Originally the idea in particle physics was to investigate the micro-cosmos; now we find out that measuring the micro-cosmos means investigating matter under conditions that existed nanoseconds after the Big Bang. Of course, many questions remain in particle physics itself, like neutrinos, the matter–antimatter asymmetry and the real unification of the forces.

I was advised by some senior colleagues to resign because it was not possible to build LEP on a constant budget

With LEP and the LHC, the number of outside users who build and operate the experiments increased drastically, so the physics competence now rests to a large extent with them. CERN’s competence is mainly new technology, both for experiments and accelerators. At LEP, cheap “concrete” magnets were used instead of iron ones to save on investment, and coupled RF cavities were invented to use power more efficiently, later complemented by superconducting cavities. New detector technologies following the CERN tradition of Charpak turned the LEP experiments into precision ones. This line was followed by the LHC, with the first large-scale use of high-field superconducting magnets and superfluid-helium cooling technology. Whatever happens in elementary particle physics, technology will remain one of CERN’s key competences. Above and beyond elementary particle physics, CERN has become such a symbol and big success for Europe, and a model for worldwide international cooperation, that it is worth a large political effort to guarantee its long-term future.

Opinion The Large Electron Positron collider changed particle physics forever. As the field eyes up the next major collider, former CERN Director-General Herwig Schopper describes what it took to make LEP happen. https://cerncourier.com/wp-content/uploads/2019/07/CCJulAug19_Interview_Schopper2.jpg
Memories from Caltech https://cerncourier.com/a/memories-from-caltech/ Thu, 11 Jul 2019 09:54:23 +0000 https://preview-courier.web.cern.ch?p=83642 Stephen Wolfram reflects on Gell-Mann’s complex character and his rivalry with Richard Feynman.


In the mid-1970s, particle physics was hot. Quarks were in. Group theory was in. Field theory was in. And so much progress was being made that it seemed like the fundamental theory of physics might be close at hand. Right in the middle of all this was Murray Gell-Mann – responsible for not one, but most, of the leaps of intuition that had brought particle physics to where it was. There’d been other theories, but Murray’s, with their somewhat elaborate and abstract mathematics, were always the ones that seemed to carry the day.

It was the spring of 1978 and I was 18 years old. I’d been publishing papers on particle physics for a few years. I was in England, but planned to soon go to graduate school in the US, and was choosing between Caltech and Princeton. One weekend afternoon, the phone rang. “This is Murray Gell-Mann”, the caller said, then launched into a monologue about why Caltech was the centre of the universe for particle physics at the time. Perhaps not as star-struck as I should have been, I asked a few practical questions, which Murray dismissed. The call ended with something like, “Well, we’d like to have you at Caltech”.

I remember the evening I arrived, wandering around the empty fourth floor of Lauritsen Lab – the centre of Caltech theoretical particle physics. There were all sorts of names I recognised on office doors, and there were two offices that were obviously the largest: “M. Gell-Mann” and “R. Feynman”. In between them was a small office labelled “H. Tuck”, which by the next day I’d realised was occupied by the older but very lively departmental assistant.

I never worked directly with Murray but I interacted with him frequently while I was at Caltech. He was a strange mixture of gracious and gregarious, together with austere and combative. He had an expressive face, which would wrinkle up if he didn’t approve of what was being said. Murray always grouped people and things into those he approved of and those he didn’t – to which he would often give disparaging nicknames. (He would always refer to solid-state physics as “squalid-state physics”.) Sometimes he would pretend that things he did not like simply did not exist. I remember once talking to him about something in quantum field theory called the beta function. His face showed no recognition of what I was talking about, and I was getting slightly exasperated. Eventually I blurted out, “But, Murray, didn’t you invent this?” “Oh”, he said, suddenly much more charming, “You mean g times the psi function. Why didn’t you just say that? Now I understand.”

I could never quite figure out what it was that made Murray impressed by some people and not others. He would routinely disparage physicists who were destined for great success, and would vigorously promote ones who didn’t seem so promising, and didn’t in fact do well. So when he promoted me, I was on the one hand flattered, but on the other hand concerned about what his endorsement might really mean.

Feynman interactions

The interaction between Murray Gell-Mann and Richard Feynman was an interesting thing to behold. Both came from New York, but Feynman relished his “working class” New York accent while Gell-Mann affected the best pronunciation of words from any language. Both would make surprisingly childish comments about the other. I remember Feynman insisting on telling me the story of the origin of the word “quark”. He said he’d been talking to Murray one Friday about these hypothetical particles, and in their conversation they’d needed a name for them. Feynman told me he said (no doubt in his characteristic accent), “Let’s call them ‘quacks’”. The next Monday, he said, Murray came to him very excited and said he’d found the word “quark” in a novel by James Joyce. In telling this to me, Feynman then went into a long diatribe about how Murray always seemed to think the names for things were so important. “Having a name for something doesn’t tell you a damned thing,” Feynman said. Feynman went on, mocking Murray’s concern for things like what different birds are called. (Murray was an avid bird watcher.) Meanwhile, Feynman had worked on particles that seemed (and turned out to be) related to quarks. Feynman had called them “partons”. Murray insisted on always referring to them as “put-ons”.

He was a strange mixture of gracious and gregarious

Even though in terms of longstanding contributions to particle physics, Murray was the clear winner, he always seemed to feel as if he was in the shadow of Feynman, particularly with Feynman’s showmanship. When Feynman died, Murray wrote a rather snarky obituary, saying of Feynman: “He surrounded himself with a cloud of myth, and he spent a great deal of time and energy generating anecdotes about himself.” I never quite understood why Murray – who could have gone to any university in the world – chose to work at Caltech for 33 years in an office two doors down from Feynman.

Murray cared a lot about what people thought of him, but wasn’t particularly good at reading other people. Yet, alongside the brush-offs and the strangeness, he could be personally very gracious. I remember him inviting me several times to his house. He also did me quite a few favours in my career. I don’t know if I would call Murray a friend, though, for example, after his wife Margaret died, he and I would sometimes have dinner together, at random restaurants around Pasadena. It wasn’t so much that I felt of a different generation from him (which of course I was). It was more that he exuded a certain aloof tension, that made one not feel very sure about what the relationship really was.

Murray Gell-Mann had an amazing run. For 20 years he had made a series of bold conjectures about how nature might work – strangeness, V-A theory, SU(3), quarks, QCD – and in each case he had been correct, while others had been wrong. He had one of the more remarkable records of repeated correct intuition in the whole history of science.

He tried to go on. He talked about “grand unification being in the air”, and (along with many other physicists) discussed the possibility that QCD and the theory of weak interactions might be unified in models based on groups such as SU(5) and SO(10). He considered supersymmetry. But quick validations of these theories didn’t work out, though even now it’s still conceivable that some version of them might be correct.

I have often used Murray as an example of the challenges of managing the arc of a great career. From his twenties to his forties, Murray had the golden touch. His particular way of thinking had success after success, and in many ways he defined physics for a generation. By the time I knew him, the easy successes were over. Perhaps it was Murray; more likely, it was just that the easy pickings from his approach were now gone. He so wanted to succeed as he had before, not just in physics but in other fields and endeavours. But he never found a way to do it – and always bore the burden of his early success.

Though Murray is now gone, the physics he discovered will live on, defining an important chapter in the quest for our understanding of the fundamental structure of our universe. 

• This article draws on a longer tribute published on www.stephenwolfram.com.

Feature Stephen Wolfram reflects on Gell-Mann’s complex character and his rivalry with Richard Feynman. https://cerncourier.com/wp-content/uploads/2019/07/CCJulAug19_caltech.jpg
We’ve been here before… https://cerncourier.com/a/weve-been-here-before/ Thu, 11 Jul 2019 09:28:29 +0000 https://preview-courier.web.cern.ch?p=83637 Tales of colliders contained in 60 illustrious years of CERN Courier offer a rich perspective on the strategic decisions facing the field today.


In April 1960, Prince Philip, husband of Queen Elizabeth II, piloted his Heron airplane to Geneva for an informal visit to CERN. Having toured the laboratory’s brand new “25 GeV” Proton Synchrotron (PS), he turned to his host, president of the CERN Council François de Rose, and struck at the heart of fundamental exploration: “What have you got in mind for the future? Having built this machine, what next?” he asked. De Rose replied that this was a big problem for the field: “We do not really know whether we are going to discover anything new by going beyond 25 GeV,” he said. Unbeknown to de Rose and everyone else at that time, the weak gauge bosons and other phenomena that would transform particle physics were lying not too far above the energy of the PS.

This is a story repeated throughout the history of elementary particle physics, and one of which CERN Courier, celebrating its 60th anniversary this summer, offers a bite-sized glimpse.

The first issue of the Courier was published in August 1959, just a few months before the PS switched on, at a time when accelerators were taking off. The PS was the first major European machine, quickly reaching an energy of 28 GeV, only to be surpassed the following year by Brookhaven’s Alternating Gradient Synchrotron. The March 1960 issue of the Courier described a meeting at CERN where 245 scientists from 28 countries had discussed “a dozen machines now being designed or constructed”. Even plasma-based acceleration techniques – including a “plasma betatron” at CERN – were on the table.

A time gone by

The picture is not so different today (see Granada symposium thinks big), though admittedly thinner on projects under construction. Some things remain eerily pertinent: swap “25 GeV” for “13 TeV” in de Rose’s response to Prince Philip, and his answer still stands with respect to what lies beyond the LHC’s energy. Other things are of a time gone by. The third issue of the Courier, in October 1959, proudly declared that “elementary particles number 32” (by 1966 that number had grown to more than 50 – see “Not so elementary”). Another early issue likened the 120 million Swiss Franc cost of the PS to “10 cigarettes for each of the 220 million inhabitants of CERN’s 12 Member States”.

The general situation of elementary particle physics back then, argued the August 1962 issue, could be likened to atomic physics in 1924 before the development of quantum mechanics. Summarising the 1962 ICHEP conference held at CERN, which attracted an impressive 450 physicists from 158 labs in 39 countries, the leader of the CERN theory division Léon Van Hove wrote: “The very fact that the variety of unexpected findings is so puzzling is a promise that new fundamental discoveries may well be in store at the end of a long process of elucidation.” Van Hove was right, and the 1960s brought the quark model and electroweak theory, laying a path to the Standard Model. Not that this paradigm shift is much apparent when flicking through issues of the Courier from the period; only hindsight can produce the neat logical history that most physics students learn.

Within a few years of PS operation, attention turned to a machine for the 1970s. A report on the 24th session of the CERN Council in the July 1963 issue noted ECFA’s recommendation that high priority be given to the construction in Europe of two projects: a pair of intersecting storage rings (ISR, which would become the world’s first hadron collider) and a new proton accelerator of a very high energy “probably around 300 GeV”, which would be 10 times the size of the PS (and eventually named the Super Proton Synchrotron, SPS). Mervyn Hine of the CERN directorate for applied physics outlined in the August 1964 issue how this so-called “Summit program” should be financed. He estimated the total annual cost (including that of the assumed national programmes) to be about 1100 million Swiss Francs by 1973, concluding that this was in step with a minimum growth for total European science. He wrote boldly: “The scientific case for Europe’s continuing forcefully in high-energy physics is overwhelming; the equipment needed is technically feasible; the scientific manpower needed will be available; the money is trivial. Only conservatism or timidity will stop it.”

The development of science

Similar sentiments exist now in view of a post-LHC collider. There is also nothing new, as the field grows ever larger in scale, in attacks on high-energy physics from outside. In an open letter published in the Courier in April 1964, nuclear physicist Alvin Weinberg argued that the field had become “remote” and that few other branches of science were “waiting breathlessly” for insights from high-energy physics without which they could not progress. Director-General Viki Weisskopf, writing in April 1965, concluded that the development of science had arrived at a critical stage: “We are facing today a situation where it is threatened that all this promising research will be slowed down by constrained financial support of high-energy physics.”

Deciding where to build the next collider and getting international partners on board was also no easier in the past, if the SPS was any guide. The September 1970 issue wrote that the “present impasse in the 300 GeV project” was due to the difficulty of selecting a site, adding: “At the same time it is disturbing to the traditional unity of CERN that only half the Member States (Austria, Belgium, Federal Republic of Germany, France, Italy and Switzerland) have so far adopted a positive attitude towards the project.” That, half a century later, the SPS – soon afterwards chosen to be built at CERN – would be feeding protons into a 27 km-circumference hadron collider with a centre-of-mass energy of 13 TeV was unthinkable.

A giant LEP for mankind

An editorial in the January/February 1990 issue of the Courier titled “Diary of a dramatic decade” summed up a crucial period that had the Large Electron Positron (LEP) collider at its centre: Back in 1980, it said, the US was the “mecca” of high-energy physics. “But at CERN, the vision of Carlo Rubbia, the invention of new beam ‘cooling’ techniques by Simon van der Meer, and bold decisions under the joint Director-Generalship of John Adams and Léon Van Hove had led to preparations for a totally new research assault – a high-energy proton–antiproton collider.” The 1983 discoveries of the W and Z bosons had, it continued, “nudged the centroid of particle physics towards Europe,” and, with LEP and also HERA at DESY operating, Europe was “casting off the final shackles of its war-torn past”.

Despite involving what at that time was Europe’s largest civil-engineering project, LEP didn’t appear to attract much public attention. It was planned to be built within a constant CERN budget, but there were doubts as to whether this was possible (see Lessons from LEP). The September 1983 issue reported on an ECFA statement noting that reductions in CERN’s budget had put its research programme under “severe stress”, impairing the lab’s ability to capitalise on its successful proton–antiproton programme. “The European Laboratories have demonstrated their capacity to lead the world in this field, but the downward trend of support both for CERN and in the Member States puts this at serious risk,” it concluded. At the same time, following a famous meeting in Lausanne, the ECFA report noted that proton–proton collision energies of the order of 20 TeV could be reached with superconducting magnets in the LEP tunnel and “recommends that this possibility be investigated”.

Physicists were surprisingly optimistic about the possibility of such a large hadron collider. In the October 1981 issue, Abdus Salam wrote: “In the next decade, one may envisage the possible installation of a pp̅ collider in the LEP tunnel and the construction of a supertevatron… But what will happen to the subject 25 years from now?” he asked. “Accelerators may become as extinct as dinosaurs unless our community takes heed now and invests efforts on new designs.” Almost 40 years later, the laser-based acceleration schemes that Salam wrote of, and others, such as muon colliders, are still being discussed.

Accelerator physicist Lee Teng, in an eight-page long report about the 11th International Conference on High Energy Accelerators in the September 1980 issue, pointed out that seven decades in energy had been mastered in 50 years of accelerator construction. Extrapolating to the 21st century, he envisaged “a 20 TeV proton synchrotron and 350 GeV electron–positron linear colliders”. On CERN’s 50th anniversary in September 2004, former Director-General Luciano Maiani predicted what the post-2020 future might look like, asserting that “a big circular tunnel, such as that required by a Very Large Hadron Collider, would have to go below Lake Geneva or below the Jura (or both). Either option would be simply too expensive to consider. This is why a 3–5 TeV Compact Linear Collider (CLIC) would be the project of choice for the CERN site.” It is the kind of decision that the current CERN management is weighing up today, 15 years later.

Driving success

This collider-centric view of 60 years of CERN Courier does little justice to the rest of the magazine’s coverage of fixed-target physics, neutrino physics, cosmology and astrophysics, detector and accelerator physics, computing, applications, and broader trends in the field. It is striking how much the field has advanced and specialised. Equally, it is heartening to find so many parallels with today. Some are sociological: in October 1995 a report on an ECFA study noted “much dissatisfaction” with long author lists and practical concerns about the size of the even bigger LHC-experiment collaborations over the horizon. Others are more strategic.

It is remarkable to read through back issues of the Courier from the mid-1970s to find predictions for the masses of the W and Z bosons that turned out to be correct to within 15%. This drove the success of the Spp̅S and LEP programmes and led naturally to the LHC – the collider to hunt down the final piece of the electroweak jigsaw, the “so-called Higgs mesons” as a 1977 issue of the Courier put it. Following the extraordinary episode that was the development and completion of the Standard Model, we find ourselves in a position similar to the one we were in during the PS days regarding what lies over the energy frontier. Looking back at six decades of fundamental exploration as seen through the imperfect lens of this magazine, it would take a bold soul to claim that it isn’t worth a look.

Feature Tales of colliders contained in 60 illustrious years of CERN Courier offer a rich perspective on the strategic decisions facing the field today. https://cerncourier.com/wp-content/uploads/2019/07/CCJulAug19_CCat60_frontis_main.jpg
Strong interactions https://cerncourier.com/a/strong-interactions/ Thu, 11 Jul 2019 09:03:29 +0000 https://preview-courier.web.cern.ch?p=83633 Harald Fritzsch, who collaborated with Gell-Mann in the early 1970s, describes the steps that led to a full understanding of strong interactions.


Murray Gell-Mann’s scientific career began at the age of 15, when he received a scholarship from Yale University that allowed him to study physics. Afterwards he went to the Massachusetts Institute of Technology and worked under Victor Weisskopf. He completed his PhD in 1951, at the age of 21, and became a postdoc at the Institute for Advanced Study in Princeton.

The following year, Gell-Mann joined the research group of Enrico Fermi at the University of Chicago. He was particularly interested in the new particles that had been discovered in cosmic rays, such as the six hyperons and the four K-mesons. Nobody understood why these particles were created easily in collisions of nucleons, yet decayed rather slowly. To understand the peculiar properties of the new hadrons, Gell-Mann introduced a quantum number, which he called strangeness (S): nucleons were assigned S = 0; the Λ hyperon and the three Σ hyperons were assigned S = (–1); the two Ξ hyperons had S = (–2); and the negatively charged K-meson had S = (–1).

Strange assumptions

Gell-Mann assumed that the strangeness quantum number is conserved in the strong and electromagnetic interactions, but violated in the weak interactions. The decays of the strange particles into particles without strangeness could only proceed via the weak interaction.

The idea of strangeness thus explained, in a simple way, the production and decay rates of the newly discovered hadrons. A new particle with S = (–1) could be produced by the strong interaction together with a particle with S = (+1) – e.g. a negatively charged Σ could be produced together with a positively charged K meson. However, a positively charged Σ could not be produced together with a negatively charged K meson, since both particles have S = (–1).

In 1954 Gell-Mann and Francis Low published details of the renormalisation of quantum electrodynamics (QED), introducing a new method called the renormalisation group, which Kenneth Wilson (a former student of Gell-Mann) later used to describe phase transitions in condensed-matter physics. Specifically, Gell-Mann and Low calculated the energy dependence of the renormalised coupling constant. In QED the effective coupling constant increases with the energy. This was measured at the LEP collider at CERN, and found to agree with the theoretical prediction.
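
In leading-logarithmic approximation, and with a single charged lepton of mass m in the loop, this running takes the familiar textbook form (quoted here for orientation):

\alpha(Q^2) = \frac{\alpha(m^2)}{1 - \dfrac{\alpha(m^2)}{3\pi}\,\ln\dfrac{Q^2}{m^2}}

so the effective coupling grows logarithmically with the momentum transfer Q. With all charged fermions included, the value measured at the Z pole is about 1/129, compared with 1/137 at low energies.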

In 1955 Gell-Mann went to Caltech in Pasadena, on the invitation of Richard Feynman, and was quickly promoted to full professor – the youngest in Caltech’s history. In 1957, Gell-Mann started to work with Feynman on a new theory of the weak interaction in terms of a universal Fermi interaction given by the product of two currents and the Fermi constant. These currents were both vector currents and axial-vector currents, and the lepton current is a product of a charged lepton field and an antineutrino field. The “V–A” theory showed that since the electrons emitted in a beta-decay are left-handed, the emitted antineutrinos are right-handed – thus parity is not a conserved quantum number. Some experiments were in disagreement with the new theory. Feynman and Gell-Mann suggested in their paper that these experiments were wrong, and it turned out that this was the case.

In 1960 Gell-Mann invented a new symmetry to describe the new baryons and mesons found in cosmic rays and in various accelerator experiments. He used the unitary group SU(3), which is an extension of the isospin symmetry based on the group SU(2). The two nucleons and the six hyperons are described by an octet representation of SU(3), as are the eight mesons. Gell-Mann often described the SU(3) symmetry as the “eightfold way” in reference to the eightfold path of Buddhism. At that time, it was known that there exist four Δ resonances, three Σ resonances and two Ξ resonances. There is no SU(3) representation with nine members, but there is a decuplet representation with 10 members. Gell-Mann predicted the existence and the mass of a negatively charged 10th particle with strangeness S = (–3), which he called the Ω particle.
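
The mass prediction rested on the nearly equal spacing between successive rows of the decuplet as strangeness decreases by one unit. With modern rounded values (quoted for orientation, not the numbers available in 1962):

M_{\Sigma^*} - M_{\Delta} \approx M_{\Xi^*} - M_{\Sigma^*} \approx M_{\Omega} - M_{\Xi^*} \approx 150\ \mathrm{MeV}

giving M_Ω ≈ 1232 + 3 × 150 ≈ 1680 MeV, close to the measured value of about 1672 MeV.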

The Ω is unique in the decuplet: due to its strangeness it could only decay by the weak interaction, and so would have a relatively long lifetime. This particle was discovered in 1964 by Nicholas Samios and his group at Brookhaven National Laboratory, at the mass Gell-Mann had predicted. The SU(3) symmetry was very successful and in 1969 Gell-Mann received the Nobel Prize in Physics “for his contributions and discoveries concerning the classification of elementary particles and their interactions.”

In 1962 Gell-Mann proposed the algebra of currents, which led to many sum rules for cross sections, such as the Adler sum rule. Current algebra was the main topic of research in the following years, and Gell-Mann wrote several papers on it with his colleague Roger Dashen.

Quark days

In 1964 Gell-Mann discussed the triplets of SU(3), which he called “quarks”. He proposed that quarks were the constituents of baryons and mesons, with fractional electric charges, and published his results in Physics Letters. Feynman’s former PhD student George Zweig, who was working at CERN, independently made the same proposal. But the quark model was not taken seriously by many physicists. For example, the Ω is a bound state of three strange quarks placed symmetrically in an s-wave, so its wavefunction appeared to be symmetric under the exchange of two quarks, in violation of the Pauli principle. In 1968, quarks were observed indirectly in deep-inelastic electron–proton scattering experiments at SLAC.

By then it had been proposed, by Oscar Greenberg and by Moo-Young Han and Yoichiro Nambu, that quarks possess an additional property that keeps the Pauli principle intact. If the quarks come in three “colours” – which later came to be called red, green and blue – hadrons can be regarded as colour singlets, the simplest being the bound states of a quark and an antiquark (mesons) or of three quarks (baryons). Since baryon wave functions are antisymmetric in the colour index, there is no problem with the Pauli principle. Taking the colour quantum number as a gauge quantum number, like the electric charge in QED, yields a gauge theory of the strong interactions: colour symmetry is an exact symmetry and the gauge bosons are massless gluons, which transform as an octet of the colour group. Nambu and Han had essentially arrived at quantum chromodynamics (QCD), but in their model the quarks carried integer electric charges.

I was introduced to Gell-Mann by Ken Wilson in 1970 at the Aspen Center for Physics, and we worked together for a period. In 1972 we wrote down a model in which the quarks had fractional charges, proposing that, since only colour singlets occur in the spectrum, fractionally charged quarks remain unobserved. The discovery in 1973 by David Gross, David Politzer and Frank Wilczek that the self-interaction of the gluons leads to asymptotic freedom – whereby the gauge coupling constant of QCD decreases as the energy is increased – made it plausible that quarks are permanently confined. It was rewarded with the 2004 Nobel Prize in Physics, although a rigorous proof of quark confinement is still missing.
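
At one loop – a textbook formula added here for illustration, not one quoted in the article – the QCD coupling runs as

\[
\alpha_s(Q^2) \;=\; \frac{12\pi}{\left(33 - 2 n_f\right)\,\ln\!\left(Q^2/\Lambda^2\right)},
\]

which decreases logarithmically with increasing momentum transfer Q as long as the number of quark flavours n_f is below 17 – the opposite behaviour to QED, and the essence of asymptotic freedom.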

Gell-Mann did not just contribute to the physics of the strong interactions. In 1979, together with Pierre Ramond and Richard Slansky, he wrote a paper working out details of the “seesaw mechanism” – a theoretical proposal, building on ideas introduced a couple of years earlier, to account for the very small values of the neutrino masses. After 1980 he also became interested in string theory. His wide-ranging interests in languages and other areas beyond physics are also well documented.
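
In its simplest form – an illustrative sketch, not wording taken from the Gell-Mann–Ramond–Slansky paper itself – the seesaw relation reads

\[
m_\nu \;\approx\; \frac{m_D^2}{M_R},
\]

where m_D is a Dirac mass of roughly electroweak size and M_R the mass of a heavy right-handed Majorana neutrino; with m_D of order 100 GeV and M_R of order $10^{14}$–$10^{15}$ GeV, the light neutrino masses come out in the 0.01–0.1 eV range.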

I enjoyed working with Murray Gell-Mann. We had similar interests in physics, and we worked together until 1976, when I left Caltech for CERN. He visited often. In May 2019, during a trip to Los Alamos National Laboratory, I was fortunate to visit Murray at his house in Santa Fe one last time. 

Further memories of Gell-Mann can be found at Gell-Mann’s multi-dimensional genius and Memories from Caltech.

Feature: Harald Fritzsch, who collaborated with Gell-Mann in the early 1970s, describes the steps that led to a full understanding of strong interactions.
Gell-Mann’s multi-dimensional genius https://cerncourier.com/a/gell-manns-multi-dimensional-genius/ Thu, 11 Jul 2019 08:54:52 +0000 https://preview-courier.web.cern.ch?p=83628 Murray Gell-Mann was one of the great geniuses of the 20th century, says Lars Brink, and stands out among other Nobel laureates.

One of the 20th century’s most amazing brains has stopped working. Nobel laureate Murray Gell-Mann died on 24 May at the age of 89. It is impossible to write a complete obituary of him, since he had so many dimensions that some will always be forgotten or neglected.

Murray was the leading particle theorist in the 1950s and 1960s in a field that had attracted the brightest young stars of the post-war generation. But he was also a polyglot who could tell you any noun in at least 25 languages, a walking encyclopaedia, a nature lover and a protector of endangered species, who knew all the flowers and birds. He was an early environmentalist, but he was so much more. It has been one of the biggest privileges in my life to have worked with him and to have been a close friend of his.

Murray Gell-Mann was born into a Jewish immigrant family in New York six weeks before the stock-market crash of October 1929. He was a late addition to the family, with a brother who was nine years older and relatively elderly parents. He used to joke that he had been born by accident. His father had failed in his studies and, after Murray’s birth, worked as a guard in a bank vault. Murray was never particularly close to his father, but often talked about him.

Child prodigy

According to family legend, the first words that Murray spoke were “The lights of Babylon”, when he was looking at the night sky over New York at the age of two. At three, he could read and multiply large numbers in his head. At five he was correcting his elders’ language and holding his own in discussions. His interest in numismatics had already begun: when a friend of the family showed him what he claimed was a coin from Emperor Tiberius’ time, Murray corrected the pronunciation and said it was not from that period. At the age of seven, he participated in – and won – a major annual spelling competition in New York for students up to the age of 12. The last word that only he could spell and explain was “subpoena”, also citing its Latin origins and correcting the pronunciation of the moderator.

By the age of nine he had essentially memorised the Encyclopaedia Britannica. The task sounds impossible, but some of us did a test behind his back once in the 1970s. The late Myron Bander had looked up and studied an obscure word and steered the discussion on to it over lunch. Of course Murray knew what the word was. He even recalled the previous and subsequent entries on the page.

Murray’s parents didn’t know what to do with him, but his piano teacher (music was not his strong suit) made them apply for a scholarship so that he could start at a good private school. He was three years younger than his classmates, yet they always looked to him to see if he approved of what the teachers said. His tests were faultless, except for the odd exception. Once he came home having scored “only” 97%, to which his father said: “How could you miss this?” His brother, who was more “normal”, was a great nature lover and became a nature photographer and later a journalist. He taught Murray about birds and plants, which would become a lifelong passion.

At the age of 15, he finished high school and went to Yale. He did not know which subject he would choose as a major, since he was interested in so many. It became physics, partly to please his father, who had insisted on engineering so that he could get a good job. He then went to MIT for his doctoral studies, with the legendary Victor “Viki” Weisskopf as his advisor. Murray wanted to do something pioneering, but he didn’t succeed. He tried for a whole semester and at the same time studied Chinese, learning enough characters to read texts. He finally decided to present a thesis in nuclear physics, which was approved but which he never wanted to talk about. When Weisskopf, later in life, was asked what his biggest contribution to physics was, he answered: “Murray Gell-Mann”.

At the age of 21 Murray was ready to fly and went to the Institute for Advanced Study (IAS) as one of Robert Oppenheimer’s young geniuses. The next year he went to the University of Chicago to work under Enrico Fermi, first as an instructor, becoming an associate professor within a few years. Even though he had not yet produced outstanding work, by the time he came to Chicago he was already branded a genius. At the IAS he had started to work on particle physics. He collaborated with Francis Low on renormalisation and realised that the coupling constant in a renormalisable quantum field theory runs with energy. As would happen so often, he procrastinated over the publication until 1954, by which time Petermann and Stückelberg had published this result.

This was in the aftermath of the triumph of QED, and Gell-Mann wanted to attack the strong interactions. He started his odyssey to classify all the new particles and introduced the concept of “strangeness” to specify the kaons and the corresponding baryons. This was also done independently by Kazuhiko Nishijima. When he was back at the IAS in 1955, Murray solved the problem of the neutral kaons – in modern language, the existence of the two states KL and KS with very different lifetimes. According to him, he showed this to Abraham Pais who said, “Why don’t we publish it?”, which they did. They were never friends after that. Murray also once told me that this was the hardest problem that he had solved.

A cavalcade of results

Aged 26, he lectured at Caltech on his renormalisation and kaon work. Richard Feynman, the greatest physicist of the time, said that he had thought he knew everything, but these things he did not know. Feynman immediately said that Murray had to come to Caltech and dragged him to the dean. A few weeks later, he was a full professor. A large cavalcade of new results began to come out. Because he had difficulty letting go of his papers, they numbered just a few a year. But they were like cathedrals, with so many new details that he came to dominate modern particle physics.

After the ground-breaking work of T D Lee and C N Yang on parity violation in the weak interactions, Gell-Mann started to work on a dynamical theory – as did Feynman. In the end the dean of the faculty forced them to publish together, and the V–A theory was born. George Sudarshan and Robert Marshak also published the same result, and there was a long-lasting fight about who had told whom first. Murray’s part of the paper, the second half, is also a first sketch of the Standard Model, and every sentence is worth reading carefully. It takes students of exegetics to unveil all the glory of Murray’s texts. Murray was to physics writing what Joseph Conrad was to novel writing!

Murray then turned back to the strong interactions and, with Maurice Lévy, developed the sigma model for pion physics to formulate the partially conserved axial-vector current (PCAC). This was published within days of Yoichiro Nambu’s ground-breaking paper, in which he understood pion physics and PCAC in terms of the spontaneous breaking of chiral symmetry. In a note added in proof they introduced a “funny” angle to describe the beta decay of 14O, which a few years later became the Cabibbo angle in Nicola Cabibbo’s universal treatment of the weak interactions.

Gell-Mann then made the great breakthrough when he classified the strongly interacting particles in terms of families of SU(3), a discovery also made by Yuval Ne’eman. The paper was never published in a journal and he used to joke that one day he would find out who rejected it. With this scheme he could predict the existence of the triply strange Ω baryon, which was discovered in 1964 right where he predicted it would be. It paved the way for Gell-Mann’s suggestion in 1963 that all the baryons were made up of three fundamental particles, which in the published form he came to call quarks, after a line in James Joyce’s Finnegans Wake, “three quarks for Muster Mark”. The same idea was also put forward by George Zweig who called them “aces”. It was a very difficult thing for Murray to propose such a wild idea, and he formulated it extremely carefully to leave all doors open. Again, his father’s approval loomed in the background.

With the introduction of current algebra he had laid the ground for the explosion in particle theory during the 1970s. In 1966, Weisskopf’s 60th birthday was celebrated, and somehow Murray failed to show up. When he later received the proceedings, he was so ashamed that he did not open it. Had he done so, he would have found Nambu’s suggestion of a non-abelian gauge field theory with coloured quarks for the strong interactions. Nambu did not like fractional charges so he had given the quarks integer charges. Murray later said that, had he read this paper, he would have been able to formulate QCD rather quickly.

Legacy

When, at the age of 40 in 1969, he received the Nobel Prize in Physics as the sole recipient, he had been a heavily nominated candidate for the previous decade. Next year the Nobel archives for this period will be open, and scholars can study the material leading up to the prize. Unfortunately, his father had died a few weeks before the prize announcement. Murray once said to me, “If my father had lived two weeks longer, my life would have been different.”

During the 1950s and 1960s Gell-Mann had often been described in the press as the world’s most intelligent man. With a Nobel Prize in his pocket, the attraction to sit on various boards and committees became too strong to resist. His commitment to conserving endangered species also took up more of his time. Murray had also become a great collector of pre-Columbian artefacts and these were often expensive and difficult to obtain.

In the 1970s, he was displaced from the throne by people from the next generation. Murray was still the one invited to give the closing lectures at major conferences, but his own research started to suffer somewhat. In the mid-1970s, I came to Caltech as a young postdoctoral fellow. I had met him in a group before, but trembled like an aspen leaf when I first met him there. He had, of course, found out from where in Sweden I came and pronounced my name just right, and demanded that everyone else in the group do so. Pierre Ramond also arrived as a postdoc at that time, having been convinced by Murray to leave his position at Yale. After a few months we started to work together on supergravity. We did the long calculations, since Murray was often away. But he always contributed and could spot any weak links in our work immediately. Once, when we were in the middle of solving a problem after a period of several days, he came in and looked at what we did and wrote the answer on the board. Two days later we came to exactly that result. John Schwarz, who was a world champion in such calculations, was impressed and humbled.

When I left Caltech I got carte blanche from Murray to return as often as I wanted, and during those visits I worked with Schwarz and Michael Green on developing string theory. Murray was always very positive about our work, which few others were. It was entirely thanks to him that we could develop the theory. Eventually, I couldn’t go to the US quite as often. Murray had also lost his wife in the early 1980s and never really recovered from this. In the mid-1980s he got the chance to set up a new institute in Santa Fe, which became completely interdisciplinary. He loved nature in New Mexico and here he could work on the issues that he now preferred, such as linguistics and large-scale order in nature. He dropped particle physics but was always interested in what happened in the field. Edward Witten had taken over the leadership of fundamental physics and Murray could not compete there.

Being considered the world’s most intelligent person did not make Murray very happy. He had trouble finding real friends among his peers. They were simply afraid of him. I often saw people looking away. The post-war research world is a single great world championship. For us who were younger, it was so obvious that he was intellectually superior to us that we were not disturbed by it. All the time, though, the shadow of his father was sitting on his shoulder, which led him too often to show off when he did not need to.

Sometimes people are born with all the neurons in the right place. We sometimes hear about telephone-directory geniuses or people who know railway schedules by heart, but who otherwise are intellectually ordinary, if not rather limited. The fact that a few of them every century also get the neurons to make them intellectually superior is amazing. Among all Nobel laureates in physics, Murray Gell-Mann stands out. Others have perhaps done just as much in their research in physics and may be remembered longer, but I do not think that anyone had such a breadth of knowledge. John von Neumann, the Hungarian–American mathematician who, among other things, did pioneering work on the first electronic computers, was another such universal genius. He could show off knowing Goethe by heart, and on his deathbed he recited the first sentence of each page of Faust for his brother. Murray was certainly a pain for American linguists, as he could say so many words in so many languages that he could always gain control over a discussion.

There are so many more stories that I could tell. Once he told me “Just think what I could have done if I had worked more with physics.” His almost crazy interest in so many areas took a lot of time away from physics. But he will still be remembered, I hope, as one of the great geniuses of the 20th century.

Artistic encounters of the quantum kind https://cerncourier.com/a/review-quantica/ Wed, 08 May 2019 14:10:39 +0000 https://preview-courier.web.cern.ch?p=83131 Quàntica, which opened on 9 April at the Centre de Cultural Contemporània de Barcelona, invites you to explore quantum physics through the lens of both art and science.

Take a leap and enter, past the chalkboard wall filled with mathematical equations written, erased and written again, into the darkened room of projected questions where it all begins. What is reality? How do we describe nature? And for that matter, what is science and what is art?

Quàntica, which opened on 9 April at the Centre de Cultura Contemporània de Barcelona, invites you to explore quantum physics through the lens of both art and science. Curated by Mónica Bello, head of Arts at CERN, and art curator José-Carlos Mariátegui, with particle physicist José Ignacio Latorre serving as its scientific adviser, Quàntica is the second iteration of an exhibition that brings together 10 artworks resulting from Collide International art residencies at CERN.

The exhibition illustrates how interdisciplinary intersections can present scientific concepts regarded by the wider public as esoteric, in ways that bridge the gap, engage the senses and create meaning. Punctuating each piece is the idea that the principles of quantum physics, whether we like it or not, are pervasive in our lives today – from technological applications in smart phones and satellites to our philosophies and world views.

Nine key concepts – “scales”, “quantum states”, “overlap”, “intertwining”, “indeterminacy”, “randomness”, “open science”, “everyday quantum” and “change-evolution” – guide visitors through the meandering hallway. Each display point prompts a pause to consider a question that underlies the fundamental principles of quantum physics. Juxtaposed in the shared space are an artist-made particle detector and parts of experiments displayed as artistic objects. Video art installations are interspersed with video interviews of CERN physicists, including Helga Timko, who asks: what if we were to teach children quantum physics at a very young age – would they perceive the world as we do? On the ceiling above is a projection of a spiral galaxy, part of Juan Cortés’ Supralunar. Inspired by Vera Rubin’s work on dark matter and the rotational motion of galaxies, Cortés made a two-part multisensorial installation: a lens through which you see flashing lights, and vibrating plates on which to rest your chin and experience, on some level, the intensity of a galaxy’s formation.

From the very large scale, move to the very small. A recording of Richard Feynman explaining the astonishing double-slit experiment plays next to a standing demonstration allowing you to observe the counterintuitive possibilities that exist at the subatomic level. You can put on goofy glasses for Lea Porsager’s Cosmic Strike, an artwork with a sense of humour, which offers an immersive 3D animation described as “hard science and loopy mysticism”. She engages the audience’s imagination to meditate on being a neutrino as it travels through the neutrino horn, one of the many scientific artefacts from CERN’s archives that pepper the path.

Around the corner is Erwin Schrödinger’s 1935 article where he first used the word “Verschränkung” (or entanglement) and Anton Zeilinger’s notes explaining the protocol for quantum teleportation. Above these is projected a scene from Star Trek, which popularised the idea of teleportation.

The most visually striking piece in the exhibition is Cascade by Yunchul Kim, made up of three live elements. The first part is Argos (see image), splayed metallic hands that hang like lamps from the ceiling – an operational muon detector made of 41 channels blinking light as it records the particles passing through the gallery. Each signal triggers the second element, Impulse, a chandelier-like fluid-transfer system that sends drops of liquid through microtubes that flow into transparent veins of the final element, Tubular. Kim, who won the 2016 Arts at CERN Collide International Award, is an artist who employs rigorous methods and experiments in his laboratory with liquid and materials. Cascade encapsulates the surprising results knowledge-sharing can yield.

Quàntica is a must-see for anyone who views art and science as opposite ends of the academic spectrum. The first version of the exhibition was held at Liverpool in the UK last year. Co-produced by the ScANNER network (CERN, FACT, CCCB, iMAL and Le Lieu Unique), the exhibition continues until 24 September in Barcelona, before travelling to Brussels.

Rutherford in three movements https://cerncourier.com/a/review-rutherford/ Wed, 08 May 2019 13:32:02 +0000 https://preview-courier.web.cern.ch?p=83128 Accompanied by physics historian John Campbell, the viewer learns about this great scientist from his ordinary childhood as a “Kiwi boy” to his untimely death in 1937.

Professor Radium, the Atom Splitter, the Crocodile. Each is a nickname for Ernest Rutherford, who made history by explaining radioactivity, discovering the proton and splitting the atom. All his scientific and personal milestones are described in great detail in the three-part documentary Rutherford, produced by Spacegirls Production Ltd in 2011.

Accompanied by physics historian John Campbell, the viewer learns about this great scientist from his ordinary childhood as a “Kiwi boy” to his untimely death in 1937. Historical reconstructions and trips to the places (New Zealand, the UK and Canada) that characterised his life bring Rutherford back to life.

When it was still heresy to think that there existed objects smaller than an atom, Rutherford was exploring the secrets of the invisible. During his first stay in Cambridge (UK), he discovered that uranium emits two types of radiation, which he named alpha and beta. Then, continuing his research at McGill University (Canada), he discovered that radioactivity arises from the instability of the atom. He was awarded the Nobel Prize in Chemistry in 1908, and called Professor Radium after a comic-book character of that name. In those years, people did not know the effects of radiation, and “radio-toothpaste” was available to buy.

Then, in Manchester (UK), he conducted the first artificially induced nuclear reaction and described a new model of the atom, in which the nucleus is like a fly in the middle of an empty cathedral. He fired alpha particles at nitrogen gas and obtained oxygen plus hydrogen, earning the epithet of the world’s first “atom splitter”.

In between these big discoveries, the documentary points out that Rutherford blew tobacco smoke into his ionisation chamber, providing the groundwork for modern smoke detectors; proposed a more accurate dating system for the Earth’s age based on the rate of decay of uranium atoms; and campaigned for women’s opportunities and for the rescue of scientists from war.

The name “Crocodile” came later, from the Soviet physicist Pyotr Kapitza, as it is an animal that never turns back – or perhaps a reference to Rutherford’s loud voice, which preceded his visits. The carving of a crocodile on the outer wall of the Mond Laboratory at the Cavendish site, commissioned by Kapitza, still reminds Cambridge students and tourists of this outstanding physicist.

  • Spacegirls Production Ltd

Rutherford, transmutation and the proton https://cerncourier.com/a/rutherford-transmutation-and-the-proton/ Wed, 08 May 2019 09:43:56 +0000 https://preview-courier.web.cern.ch?p=83056 John Campbell recounts the events leading to Ernest Rutherford’s discovery of the proton, published in 1919.

In his early days, Ernest Rutherford was the right man in the right place at the right time. After obtaining three degrees from the University of New Zealand, and with two years’ original research at the forefront of the electrical technology of the day, in 1895 he won an Exhibition of 1851 Science Scholarship, which took him to the Cavendish Laboratory at the University of Cambridge in the UK. Just after his arrival, the discoveries of X-rays and radioactivity were announced and J J Thomson discovered the electron. Rutherford was an immediate believer in objects smaller than the atom. His life’s work changed to understanding radioactivity and he named the alpha and beta rays.

In 1898 Rutherford took a chair in physics at McGill University in Canada, where he achieved several seminal results. He discovered radon, demonstrated that radioactivity was simply the natural transmutation of certain elements, showed that alpha particles could be deflected in electric and magnetic fields (and hence were likely to be helium atoms minus two electrons), dated minerals and determined the age of the Earth, among other achievements.

In 1901, the McGill Physical Society called a meeting titled “The existence of bodies smaller than an atom”. Its aim was to demolish the chemists. Rutherford spoke to the motion and was opposed by a young Oxford chemist, Frederick Soddy, who was at McGill by chance. Soddy’s address “Chemical evidence for the indivisibility of the atom” attacked physicists, especially Thomson and Rutherford, who “… have been known to give expression to opinions on chemistry in general and the atomic theory in particular which call for strong protest.” Rutherford invited Soddy, who specialised in gas analysis, to join him. It was a short but fruitful collaboration in which the pair determined the first few steps in the natural transmutation of the heavy elements.

Manchester days

For some years Rutherford had wished to be more in the centre of research, which was Europe, and in 1907 moved to the University of Manchester. Here he began to follow up on experiments at McGill in which he had noted that a beam of alpha particles became fuzzy if passed through air or a thin slice of mica. They were scattered by an angle of about two degrees, indicating the presence of electric fields of 100 MV/cm, prompting his statement that “the atoms of matter must be the seat of very intense electrical forces”.

At Manchester he inherited an assistant, Hans Geiger, who was soon put to work making accurate measurements of the number of alpha particles scattered by a gold foil over these small angles. Geiger, who trained the senior undergraduates in radioactive techniques, told Rutherford in 1909 that one, Ernest Marsden, was ready for a subject of his own. Everyone knew that beta particles could be scattered off a block of metal, but no one thought that alpha particles would be. So Rutherford told Marsden to examine this. Marsden quickly found that alpha particles are indeed scattered – even if the block of metal was replaced by Geiger’s gold foils. This was entirely unexpected. It was, as Rutherford later declared, as if you fired a 15 inch naval shell at a piece of tissue paper and it came back and hit you.

One day, a couple of years later, Rutherford exclaimed to Geiger that he knew what the atom looked like: a nuclear structure with most of the mass and all of one type of charge in a tiny nucleus, thousands of times smaller than the atom itself. This is the work for which he is most famous today, eight decades after his death (CERN Courier May 2011 p20).
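
For reference – a standard result added here, not part of Campbell’s original text – the angular distribution that Geiger and Marsden’s data followed, derived by Rutherford for scattering off a point-like nucleus, is

\[
\frac{d\sigma}{d\Omega} \;=\; \left(\frac{z Z e^2}{4E}\right)^{2} \frac{1}{\sin^{4}(\theta/2)}
\]

(in Gaussian units), where z and Z are the charge numbers of the alpha particle and the target nucleus, E is the alpha’s kinetic energy and θ the scattering angle. The steep rise towards large angles is exactly what a diffuse “plum pudding” distribution of charge cannot produce.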

Around 1913, Rutherford asked Marsden to “play marbles” with alphas and light atoms, especially hydrogen. Classical calculations showed that an alpha colliding head-on with a hydrogen nucleus would cause the hydrogen to recoil with a speed 1.6 times, and a range four times, that of the alpha particle that struck it. The recoil of the less-massive, less-charged hydrogen could be detected as lighter flashes on the scintillation screen at much greater range than the alphas could travel. Marsden indeed observed such long-range “H” particles, as he named them, produced in hydrogen gas and in thin films of materials rich in hydrogen, such as paraffin wax. He also noticed that the long-ranged H particles were sometimes produced when alpha particles travelled through air, but he did not know where they came from: water vapour in the gas, absorbed water on the apparatus or even emission from the alpha source, were suggested.
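
As a quick check of the quoted factor (an illustrative aside, not a calculation reproduced from the article): for a non-relativistic, elastic, head-on collision the struck nucleus recoils with speed

\[
v_{\mathrm{H}} \;=\; \frac{2\, m_\alpha}{m_\alpha + m_{\mathrm{H}}}\; v_\alpha \;=\; \frac{2 \times 4}{4 + 1}\; v_\alpha \;=\; 1.6\; v_\alpha ,
\]

taking the alpha and hydrogen masses as 4 and 1 atomic mass units; the factor of four in range then follows from the empirical range–energy relation for charged particles in matter.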

Mid-1914 brought an end to the collaboration. Marsden wrote up his work before accepting a job in New Zealand. Meanwhile, Rutherford had sailed to Canada and the US to give lectures, spending just a month back at Manchester before heading to Australia for the annual meeting of the British Association for the Advancement of Science. Three days before his arrival, war was declared in Europe.

Splitting the atom

Rutherford arrived back in Manchester in January 1915, via a U-boat-laced North Atlantic. It was a changed world, with the young men off fighting in the war. On behalf of the Admiralty, Rutherford turned his mind to one of the most pressing problems of the war: how to detect submarines when submerged. His directional hydrophone (patented by Bragg and Rutherford) was to be fitted to fleet ships. It was not until 1917 that Rutherford could return to his scientific research, specifically alpha-particle scattering from light atoms. By December of that year, he reported to Bohr that “I am also trying to break up the atom by this method. – Regard this as private.”

He studied the long-range hydrogen-particle recoils in several media (hydrogen gas, solid materials with a lot of hydrogen present and gases such as CO2 and oxygen), and was surprised to find that the number of these “recoil” particles increased when air or nitrogen was present. He deduced that the alpha particle had entered the nucleus of the nitrogen atom and a hydrogen nucleus was emitted. This marked the discovery that the hydrogen nucleus – or the proton, to give it the name coined by Rutherford in 1920 – is a constituent of larger atomic nuclei.

Marsden was again available to help with the experiments for a few months from January 1919, whilst awaiting transport back to New Zealand after the war, and that year Rutherford accepted the position of director of the Cavendish Laboratory. Having delayed publication of the 1917 results until the war ended, Rutherford produced four papers on the light-atom work in 1919. In the fourth, “An anomalous effect in nitrogen”, he wrote “we must conclude that the nitrogen atom disintegrated … and that the hydrogen atom which is liberated formed a constituent part of the nitrogen nucleus.” He also stated: “Considering the enormous intensity of the forces brought into play, it is not so much a matter of surprise that the nitrogen atom should suffer disintegration as that the α particle itself escapes disruption into its constituents”.
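
In modern notation – added here for clarity, not wording taken from the 1919 papers – the process Rutherford had observed is

\[
{}^{4}\mathrm{He} \;+\; {}^{14}\mathrm{N} \;\to\; {}^{17}\mathrm{O} \;+\; {}^{1}\mathrm{H},
\]

an alpha particle absorbed by a nitrogen nucleus with the emission of a proton, leaving behind what was then an unknown isotope of oxygen – the interpretation that Blackett’s cloud-chamber photographs would later confirm.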

In 1920 Rutherford first proposed building up atoms from stable alphas and H ions. He also proposed that a particle of mass one but zero charge had to exist (neutron) to account for isotopes. With Wilson’s cloud chamber he had observed branched tracks of alpha particles at the end of their range. A Japanese visitor, Takeo Shimizu, built an automated Wilson cloud chamber capable of being expanded several times per second and built two cameras to photograph the tracks at right angles. Patrick Blackett, after graduating in 1921, took over the project when Shimizu returned to Japan. After modifications, by 1924 he had some 23,000 photographs showing some 400,000 tracks. Eight were forked, confirming Rutherford’s discovery. As Blackett later wrote: “The novel result deduced from these photographs was that the α was itself captured by the nitrogen nucleus with the ejection of a hydrogen atom, so producing a new and then unknown isotope of oxygen, 17O.”

As Blackett’s work confirmed, Rutherford had split the atom, and in doing so had become the world’s first successful alchemist, although this was a term that he did not like very much. Indeed, he also preferred to use the word “disintegration” rather than “transmutation”. When Rutherford and Soddy realised that radioactivity caused an element to change naturally into another, Soddy later wrote that he yelled “Rutherford, this is transmutation: the thorium is disintegrating and transmuting itself into argon (sic) gas.” Rutherford replied, “For Mike’s sake, Soddy, don’t call it transmutation. They’ll have our heads off as alchemists!”

In 1908 Rutherford had been awarded the Nobel Prize in Chemistry “for his investigations into the disintegration of the elements, and the chemistry of radioactive substances”. There was never a second prize for his detection of individual alpha particles, unearthing the nuclear structure of atoms, or the discovery of the proton. But few would doubt the immense contributions of this giant of physics. 

Centennial conference honours Feynman https://cerncourier.com/a/centennial-conference-honours-feynman/ Wed, 08 May 2019 09:24:42 +0000 https://preview-courier.web.cern.ch?p=83043 A memorial conference was held at the Institute of Advanced Studies at Nanyang Technological University in Singapore.

2018 marked the 100th anniversary of the birth of Richard Feynman. As one of several events worldwide celebrating this remarkable figure in physics, a memorial conference was held at the Institute of Advanced Studies at Nanyang Technological University in Singapore from 22 to 24 October, co-chaired by Lars Brink, KK Phua and Frank Wilczek. The format was one-hour talks followed by 45-minute discussions.

Pierre Ramond began the conference with anecdotes from his time as Feynman’s next-door neighbour at Caltech. He discussed Feynman the MIT undergraduate, his first paper and his work at Princeton as a graduate student. There, Feynman learnt about Dirac’s idea of summing over histories from Herbert Jehle. Jehle asked Feynman about it a few days later. He said that he had understood it and had derived the Schrödinger equation from it. Feynman’s adviser was John Wheeler. Wheeler was toying with the idea of a single electron travelling back and forth in time – were you to look at a slice of time you would observe many electrons and positrons. After his spell at Los Alamos, this led Feynman to the idea of the propagator, which considers antiparticles propagating backwards in time as well as particles propagating forwards. These ideas would soon underpin the quantum description of electromagnetism – QED – for which Feynman shared the 1965 Nobel Prize in Physics with Tomonaga and Schwinger.
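
In modern notation – an illustrative aside, not part of the talk as reported – the sum over histories assigns to every path an amplitude weighted by the classical action, and the propagator is

\[
K(x_b, t_b; x_a, t_a) \;=\; \int \mathcal{D}[x(t)]\; e^{\,i S[x(t)]/\hbar},
\]

from which the Schrödinger equation follows by considering propagation over an infinitesimal time slice.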

Revolutionary diagrams

The propagator was the key to the eponymous diagrams Feynman then formulated to compute the Lamb shift and other quantities. At the Singapore conference, Lance Dixon described how Feynman diagrams revolutionised the calculation of scattering amplitudes. He offered as an example the calculation of the anomalous magnetic moment of the electron, which has now reached five-loop precision and involves 12,672 diagrams. Dixon also discussed the importance of Feynman’s parton picture for understanding deep-inelastic scattering, and the staggeringly complex calculations required to understand data at the LHC.
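
For orientation – a standard result added here, not a figure from Dixon’s talk – the series starts with Schwinger’s one-loop term, which comes from a single diagram:

\[
a_e \;=\; \frac{g_e - 2}{2} \;=\; \frac{\alpha}{2\pi} + \mathcal{O}(\alpha^2) \;\approx\; 0.00116;
\]

it is the fifth-order term in α that requires the 12,672 five-loop diagrams.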

George Zweig, the most famous of Feynman’s students, and the inventor of “aces” as the fundamental constituents of matter, gave a vivid talk, recounting that it took a long time to convince a sceptical Feynman about them. He described life in the shadows of the great man as a graduate student at Caltech in the 1960s. At that time Feynman wanted to solve quantum gravity, and was giving a course on the subject of gravitation. He asked the students to suppose that Einstein had never lived: how would particle physicists discuss gravity? He quickly explained that there must be a spin-two particle mediating the force; by the second lecture he had computed the precession of the perihelion of Mercury, a juncture that other courses took months to arrive at. Zweig recounted that Feynman’s failure to invent a renormalisable theory of quantum gravity affected him for many years. Though he did not succeed, his insights continue to resound today. As Ramond earlier explained, Feynman’s contribution to a conference in Chapel Hill in 1957, his first public intervention on the subject, is now seen as the starting point for discussions on how to measure gravitational waves.

Cristiane Morais-Smith spoke on Feynman’s path integrals, comparing Hamiltonian and Lagrangian formulations, and showing their importance in perturbative QED. Michael Creutz, the son of one of Feynman’s colleagues at Princeton and Los Alamos, showed how the path integral is also necessary to be able to work on the inherently non-perturbative theory of quantum chromodynamics. Morais-Smith went on to illustrate how Feynman’s path integrals now have a plethora of applications outside particle physics, from graphene to quantum Brownian motion and dissipative quantum tunnelling. Indeed, the conference did not neglect Feynman’s famous interventions outside particle physics. Frank Wilczek recounted Feynman’s famous insight that there is plenty of room at the bottom, telling of his legendary after-dinner talk in 1959 that foreshadowed many developments in nanotechnology. Wilczek concluded that there is plenty of room left in Hilbert space, describing entanglement, quantum cryptography, quantum computation and quantum simulations. Quantum computing is the last subject that Feynman worked hard on. Artur Ekert described the famous conference at MIT in 1981 when Feynman first talked about the subject. His paper from this occasion “Simulating Physics with Computers” was the first paper on quantum computers and set the ground for the present developments.

Biology hangout

Feynman was also interested in biology for a long time. Curtis Callan painted a picture of Feynman “hanging out” in Max Delbrück’s laboratory at Caltech, even taking a sabbatical at the beginning of the 1960s to work there, exploring the molecular workings of heredity. In 1969 he gave the famous Hughes Aerospace lectures, offering a grand overview of biology and chemistry – but this was also the time of the parton model, and somehow that interest took over.

Robbert Dijkgraaf spoke about the interplay between art and science in Feynman’s life and thinking. He pointed out how important beauty is, not only in nature, but also in mathematics, for instance whether one uses a geometric or algebraic approach. Another moving moment of this wide-ranging celebration of Feynman’s life and physics was Michelle Feynman’s words about growing up with her father. She showed him both as a family man and also as a scientist, sharing his enthusiasm for so many things in life.

  • Recordings of the presentations are available online.

Paris event reflects on the history of the neutrino https://cerncourier.com/a/paris-event-reflects-on-the-history-of-the-neutrino/ Mon, 11 Mar 2019 17:08:47 +0000 https://preview-courier.web.cern.ch?p=13570 The first International Conference on the History of the Neutrino took place at the Université Paris Diderot in Paris on 5–7 September 2018.

Neutrinos, discovered in 1956, play an exceptional role in particle and nuclear physics, as well as astrophysics, and their study has led to the award of several Nobel prizes. In recognition of their importance, the first International Conference on the History of the Neutrino took place at the Université Paris Diderot in Paris on 5–7 September 2018.

The purpose of the conference, which drew 120 participants, was to cover the main steps in the history of the neutrino since 1930, when Wolfgang Pauli postulated its existence to explain the continuous energy spectrum of the electrons emitted in beta decay. Specifically, for each topic in neutrino physics, the aim was to pursue an historical approach and follow as closely as possible the discovery or pioneering papers. Speakers were chosen as much as possible for their roles as authors or direct witnesses, or as players in the main events.

The first session, “Invention of a new particle”, started with the prehistory of the neutrino – that is, the establishment of the continuous energy spectrum in beta decay – before moving into the discoveries of the three flavour neutrinos. The second session, “Neutrinos in nature”, was devoted to solar and atmospheric neutrinos, as well as neutrinos from supernovae and Earth. The third session covered neutrinos from reactors and beams including the discovery of neutral-current neutrino interactions, in which the neutrino is not transformed into another particle like a muon or an electron. This discovery was made in 1973 by the Gargamelle bubble chamber team at CERN after a race with the HPWF experiment team at Fermilab.

The major theme of neutrino oscillations from the first theoretical ideas of Bruno Pontecorvo (1957) to the Mikheyev–Smirnov–Wolfenstein effect (1985), which can modify the oscillations when neutrinos travel through matter, was complemented by talks on the discovery of neutrino oscillations by Nobel laureates Takaaki Kajita and Art McDonald. In 1998, the Super-Kamiokande experiment, led by Kajita, observed the oscillation of atmospheric neutrinos, and in 2001 the Sudbury Neutrino Observatory experiment, led by McDonald, observed the oscillation of solar neutrinos.
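
For two-flavour mixing – the textbook formula, added here for illustration rather than quoted from the conference – the probability that a neutrino produced as flavour α is detected as flavour β after travelling a distance L is

\[
P(\nu_\alpha \to \nu_\beta) \;=\; \sin^2 2\theta \;\sin^2\!\left(\frac{1.27\,\Delta m^2\,[\mathrm{eV}^2]\;L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]}\right),
\]

where θ is the mixing angle, Δm² the difference of the squared masses and E the neutrino energy; matter effects of the Mikheyev–Smirnov–Wolfenstein type replace θ and Δm² by effective values.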

The role of the neutrino in the Standard Model was discussed, as was its intrinsic nature. Although physicists have observed the rare process of double beta decay with neutrinos in the final state, neutrinoless double beta decay – with no neutrinos produced – has been sought for more than 30 years, because its observation would prove that the neutrino is Majorana-type (its own antiparticle) and not Dirac-type.

To complete the panorama, the conference discussed neutrinos as messengers from the wider universe, from the Big Bang to violent phenomena such as gamma-ray bursts or active galactic nuclei. Delegates also discussed wrong hints and tracks, which play a positive role in the development of science, and the peculiar sociological aspects that are common to particle physics and astrophysics.

Following the conference, a website dedicated to the history of this fascinating particle was created: https://neutrino-history.in2p3.fr.

BaBar celebrates its 25th anniversary https://cerncourier.com/a/babar-celebrates-its-25th-anniversary/ Mon, 11 Mar 2019 16:54:08 +0000 https://preview-courier.web.cern.ch?p=13560 BaBar has now chalked up more than 580 papers on CP violation and many other topics.

On 11 December 2018, 25 years after its inaugural meeting, the BaBar collaboration came together at the SLAC National Accelerator Laboratory in California to celebrate its many successes. David Hitlin, BaBar’s first spokesperson, described the inaugural meeting of what was then called the Detector Collaboration for the PEP-II “asymmetric” electron–positron collider, which took place at SLAC at the end of 1993. By May 1994 the collaboration had chosen the name BaBar in recognition of its primary goal to study CP violation in the neutral B-B̅ meson system. Jonathan Dorfan, PEP-II project director, recounted how PEP-II was constructed by SLAC, LBL and LLNL. Less than six years later, PEP-II and the BaBar detector were built and the first collision events were collected on 26 May 1999. Twenty-five years on, and BaBar has now chalked up more than 580 papers on CP violation and many other topics.

The “asymmetric” descriptor of the collider refers to Pier Oddone’s concept of colliding electron and positron beams of unequal energies, with the collision energy tuned to 10.58 GeV – the mass of the ϒ(4S) meson, just above the threshold for producing a pair of B mesons. The resulting relativistic boost of the ϒ(4S) enabled measurements of the distance between the points where the two mesons decay, which is critical for the study of CP violation. Equally critical was the entanglement of the B meson and anti-B meson produced in the ϒ(4S) decay: tagging the flavour of one meson at its decay determines whether it was a B0 or a B̅0 that decayed to the CP final state.
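
A back-of-the-envelope sketch using PEP-II’s nominal beam energies of about 9 GeV (electrons) and 3.1 GeV (positrons) – numbers not quoted in the article itself – shows how this works:

\[
\sqrt{s} \;=\; \sqrt{4 E_- E_+} \;\approx\; \sqrt{4 \times 9.0 \times 3.1}\ \mathrm{GeV} \;\approx\; 10.6\ \mathrm{GeV},
\qquad
\beta\gamma \;=\; \frac{E_- - E_+}{\sqrt{s}} \;\approx\; 0.56,
\]

so the two B-decay vertices are separated along the beam axis by roughly βγcτ ≈ 250 μm on average – a distance resolvable with a silicon vertex detector.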

By October 2000 PEP-II had achieved its design luminosity of 3 × 10³³ cm⁻² s⁻¹ and less than a year later BaBar published its observation of CP violation in the B0 meson system based on a sample of 32 × 10⁶ pairs of B0-B̅0 mesons – on the same day that Belle, its competitor at Japan’s KEK laboratory, published the same observation. These results led to Makoto Kobayashi and Toshihide Maskawa sharing the 2008 Nobel Prize in Physics. The ultimate luminosity achieved by PEP-II, in 2006, was 1.2 × 10³⁴ cm⁻² s⁻¹. BaBar continued to collect data on or near the ϒ(4S) meson until 2007 and in 2008 collected large samples of ϒ(2S) and ϒ(3S) mesons before PEP-II was shut down. In total, PEP-II produced 471 × 10⁶ B-B̅ pairs for BaBar studies – as well as a myriad of other events for other investigations.

The anniversary event also celebrated technical innovations, including “trickle injection” of beam particles into  PEP-II, which provided a nearly 40% increase in integrated luminosity; BaBar’s impressive particle identification, made possible by the DIRC detector; and the implementation of a computing model – spurred by PEP-II delivering significantly more than design luminosity – whereby countries provided in-kind computing support via large “Tier-A” centres. This innovation paved the way for CERN’s Worldwide LHC Computing Grid.

Notable physics results from BaBar include the first observation in 2007 of D–D̅ mixing, while in 2008 the collaboration discovered the long-sought ηb, the lowest energy particle of the bottomonium family. The team also searched for lepton-flavour violation in tau–lepton decays, publishing in 2010 what remain the most stringent limits on τ → μγ and τ → eγ branching fractions. In 2012, making it onto Physics World’s top-ten physics results of the year, the BaBar collaboration made the first direct observation of time-reversal violation by measuring the rates at which the B0 meson changes quantum states. Also published in 2012 was evidence for an excess of B̅ → D(*)τν̅τ decays, which challenges lepton universality and is an important part of the current Belle II and LHCb physics programmes. Several years after data-taking ended, it was recognised that BaBar’s data could also be mined for evidence of dark-sector objects such as dark photons, leading to the publication of two significant papers in 2014 and 2017. Another highlight, published last year, is a joint BaBar–Belle paper that resolved an ambiguity concerning the quark-mixing unitarity triangle.

Although BaBar stopped collecting data in 2008, this highly collegial team of researchers continues to publish impactful results. Moreover, BaBar alumni continue to bring their experience and expertise to subsequent experiments, ranging from ATLAS, CMS and LHCb at the LHC, Belle II at SuperKEKB, and long-baseline neutrino experiments (T2K, DUNE, HyperK) to dark-matter (LZ, SCDMS) and dark-energy (LSST) experiments in particle astrophysics.

Fixed target, striking physics https://cerncourier.com/a/fixed-target-striking-physics/ Mon, 11 Mar 2019 16:00:54 +0000 https://preview-courier.web.cern.ch?p=13522 A strong tradition of innovation and ingenuity shows that, for CERN’s North Area, life really does begin at 40.

As generations of particle colliders have come and gone, CERN’s fixed-target experiments have remained a backbone of the lab’s physics activities. Notable among them are those fed by the Super Proton Synchrotron (SPS). Throughout its long service to CERN’s accelerator complex, the 7 km-circumference SPS has provided a steady stream of high-energy proton beams to the North Area at the Prévessin site, feeding a wide variety of experiments. Sequentially named, they range from the pioneering NA1, which measured the photoproduction of vector and scalar mesons, to today’s NA64, which studies the dark sector. As the North Area marks 40 years since its first physics result, this hub of experiments large and small is as lively and productive as ever. Its users continue to drive developments in detector design, while reaping a rich harvest of fundamental physics results.

Specialised and precise

In fixed-target experiments, a particle beam collides with a target that is stationary in the laboratory frame, in most cases producing secondary particles for specific studies. High-energy machines like the SPS, which produces proton beams with momenta up to 450 GeV/c, give the secondary products a large forward boost, providing intense sources of secondary and tertiary particles such as electrons, muons and hadrons. Compared with collider experiments, fixed-target experiments tend to be more specialised and to focus on precision measurements that demand very high statistics, such as those involving ultra-rare decays.
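
The trade-off can be made explicit with a standard kinematic estimate (an illustrative addition, not figures from the article): for a 450 GeV/c proton striking a proton at rest, the available centre-of-mass energy is

\[
\sqrt{s} \;\approx\; \sqrt{2\, E_{\mathrm{beam}}\, m_p c^2} \;\approx\; \sqrt{2 \times 450 \times 0.938}\ \mathrm{GeV} \;\approx\; 29\ \mathrm{GeV},
\]

far below the 2 × 450 = 900 GeV that the same beams would reach colliding head on – but the fixed target offers vastly higher interaction rates and strongly boosted, well-collimated secondary particles.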

Fixed-target experiments have a long history at CERN, forming essential building blocks in the physics landscape in parallel to collider facilities. Among these were the first studies of the quark–gluon plasma, the first evidence of direct CP violation and a detailed understanding of how nucleon spin arises from quarks and gluons. The first muons in CERN’s North Area were reported at the start of the commissioning run in March 1978, and the first physics publication – a measurement of the production rate of muon pairs by quark–antiquark annihilation as predicted by Drell and Yan – was published in 1979 by the NA3 experiment. Today, the North Area’s physics programme is as vibrant as ever.

The longevity of the North Area programme is explained by the unique complex of proton accelerators at CERN, where each machine is not only used to inject the protons into the next one but also serves its own research programme (for example, the Proton Synchrotron Booster serves the ISOLDE facility, while the Proton Synchrotron serves the Antiproton Decelerator and the n_TOF experiment). Fixed-target experiments using protons from the SPS started taking data while the ISR collider was already in operation in the late 1970s, continued during SPS operation as a proton–antiproton collider in the early 1980s, and again during the LEP and now LHC eras. As has been the case with collider experiments, physics puzzles and unexpected results were often at the origin of unique collaborations and experiments, pushing limits in several technology areas such as the first use of silicon-microstrip detectors.

The initial experimental programme in the North Area involved two large experimental halls: EHN1 for hadronic studies and EHN2 for muon experiments. The first round of experiments in EHN1 concerned studies of: meson photoproduction (NA1); electromagnetic form factors of pions and kaons (NA7); hadronic production of particles with large transverse momentum (NA3); inelastic hadron scattering (NA5); and neutron scattering (NA6). In EHN2 there were experiments devoted to studies with high-intensity muon beams (NA2 and NA4). A third, underground, area called ECN3 was added in 1980 to host experiments requiring primary proton beams and secondary beams of the highest intensity (up to 1010 particles per cycle).

Experiments in the North Area started a bit later than those in CERN’s West Area, which started operation in 1971 with 28 GeV/c protons supplied by the PS. Built to serve the last stage of the PS neutrino programme and the Omega spectrometer, the West Area zone was transformed into an SPS area in 1975 and is best known for seminal neutrino experiments (by the CDHS and CHARM collaborations, later CHORUS and NOMAD) and hadron-spectroscopy experiments with Omega. We are now used to identifying experimental collaborations by means of fancy acronyms such as ATLAS or ALICE, to mention two of the large LHC collaborations. But in the 1970s and the 1980s, one could distinguish between the experiments (identified by a sequential number) and the collaborations (identified by the list of the cities hosting the collaborating institutes). For instance CDHS stood for the CERN–Dortmund–Heidelberg–Saclay collaboration that operated the WA1 experiment in the West Area.

Los Alamos, SLAC, Fermilab and Brookhaven National Laboratory in the US, JINR and the Institute for High Energy Physics in Russia, and KEK in Japan, among others, also had fixed-target programmes, some of which date back to the 1960s. As fixed-target programmes got into their stride, however, colliders were commanding the energy frontier. In 1980 the CERN North Area experimental programme was reviewed at a special meeting held in Cogne, Italy, and it was not completely obvious that there was a compelling physics case ahead. The review nonetheless led to highly optimised installations, thanks to strong collaborations and continuous support from the CERN management. Advances in detectors and innovations such as silicon detectors and aerogel Cherenkov counters, plus the hybrid integration of bubble chambers with electronic detectors, led to a revamp in the study of hadron interactions at fixed-target experiments, especially for charmed mesons.

Physics landscape

Experiments at CERN’s North Area began shortly after the Standard Model had been established, when the scale of experiments was smaller than it is today. According to the 1979 CERN annual report, there were 34 active experiments at the SPS (West and North areas combined) and 14 were completed in 1978. This article cannot do justice to all of them, not even to those in the North Area. But over the past 40 years the experimental programme has clearly evolved into at least four main themes: probing nucleon structure with high-energy muons; hadroproduction and photoproduction at high energy; CP violation in very rare decays; and heavy-ion experiments (see “Forty years of fixed-target physics at CERN’s North Area”).

Aside from seminal physics results, fixed-target experiments at the North Area have driven numerous detector innovations. This is largely a result of their simple geometry and ease of access, which allows more adventurous technical solutions than might be possible with collider experiments. Examples of detector technologies perfected at the North Area include: silicon microstrips and active targets (NA11, NA14); rapid-cycling bubble chambers (NA27); holographic bubble chambers (NA25); Cherenkov detectors (CEDAR, RICH); liquid-krypton calorimeters (NA48); micromegas gas detectors (COMPASS); silicon pixels with 100 ps time resolution (NA62); time-projection chambers with dE/dx measurement (ISIS, NA49); and many more. The sheer amount of data to be recorded in these experiments also led to the very early adoption of PC farms for the online systems of the NA48 and COMPASS experiments.

Another key function of the North Area has been to test and calibrate detectors. These range from the fixed-target experiments themselves to experiments at colliders (such as LHC, ILC and CLIC), space and balloon experiments, and bent-crystal applications (such as UA9 and NA63). New detector concepts such as dual-readout calorimetry (DREAM) and particle-flow calorimetry (CALICE) have also been developed and optimised. Recently the huge EHN1 hall was extended by 60 m to house two very large liquid-argon prototype detectors to be tested for the Deep Underground Neutrino Experiment under construction in the US.

If there is an overall theme concerning the development of the fixed-target programme in the North Area, one could say that it was to be able to quickly evolve and adapt to address the compelling questions of the day. This looks set to remain true, with many proposals for new experiments appearing on the horizon, ranging from the study of very rare decays and light dark matter to the study of QCD with hadron and heavy-ion beams. There is even a study under way to possibly extend the North Area with an additional very-high-intensity proton beam serving a beam dump facility. These initiatives are being investigated by the Physics Beyond Collider study (see p20), and many of the proposals explore the high-intensity frontier complementary to the high-energy frontier at large colliders. Here’s to the next 40 years of North Area physics!

Forty years of fixed-target physics at CERN's North Area

Probing nucleon structure with high-energy muons

High-energy muons are excellent probes with which to investigate the structure of the nucleon. The North Area’s EHN2 hall was built to house two sets of muon experiments: the sequential NA2/NA9/NA28 (also known as the European Muon Collaboration, EMC), which made the observation that nucleons bound in nuclei are different from free nucleons; and NA4 (pictured), which confirmed interference effects between the weak and electromagnetic interactions. A particular success of the North Area’s muon experiments concerned the famous “proton spin crisis”. In the late 1980s, contrary to the expectation from the otherwise successful quark–parton model, data showed that the proton’s spin is not carried by the quark spins. This puzzle interested the community for decades, compelling CERN to investigate further by building the NA47 Spin Muon collaboration experiment in the early 1990s (which established the same result for the neutron) and, subsequently, the COMPASS experiment (which studied the contribution of the gluon spins to the nucleon spin). A second phase of COMPASS, still ongoing today, is devoted to nucleon tomography using deeply virtual Compton scattering and, for the first time, polarised Drell–Yan reactions. Hadron spectroscopy is another area of research at the North Area, and among recent important results from COMPASS is the measurement of the pion polarisability, an important test of low-energy QCD.

Hadroproduction and photoproduction at high energy

Following the first experiment to publish data in the North Area (NA3) concerning the production of μ⁺μ⁻ pairs from hadron collisions, the ingenuity to combine bubble chambers and electronic detectors led to a series of experiments. The European Hybrid Spectrometer facility housed NA13, NA16, NA22, NA23 and NA27, and studied charm production and many aspects of hadronic physics, while photoproduction of heavy bosons was the primary aim of NA1. A measurement of the charm lifetime using the first ever microstrip silicon detectors was pioneered by the ACCMOR collaboration (NA11/NA32; see image of Robert Klanner next to the ACCMOR spectrometer in 1977), and hadron spectroscopy with neutral final states was studied by NA12 (GAMS), which employed a large array of lead-glass counters, notably in a search for glueballs. To study μ⁺μ⁻ pairs from pion interactions at the highest possible intensities, the toroidal spectrometer NA10 was housed in the ECN3 underground cavern. Nearby in the same cavern, NA14 used a silicon active target and the first big microstrip silicon detectors (10,000 channels) to study charm photoproduction at high intensity. Later, experiment NA30 enabled a direct measurement of the π⁰ lifetime by employing thin gold foils to convert the photons from the π⁰ decays. Today, electron beams are used by NA64 to look for dark photons, while hadron spectroscopy is still actively pursued, in particular at COMPASS.

CP violation and very rare decays

The discovery of CP violation in the decay of the long-lived neutral kaon to two pions at Brookhaven National Laboratory in 1964 was unexpected. To understand its origin, physicists needed to make a subtle comparison (in the form of a double ratio) between long- and short-lived neutral kaon decays into pairs of neutral and charged pions. In 1987 an ambitious experiment (NA31) showed a deviation of the double ratio from unity, providing the first evidence of direct CP violation (that is, CP violation occurring in the decay of the neutral mesons, not only in the mixing between neutral kaons). A second-generation experiment (NA48, pictured in 1996), located in ECN3 to accept a much higher primary-proton intensity, was able to measure the four decay modes concurrently thanks to the deflection of a tiny fraction of the primary proton beam into a downstream target via channelling in a “bent” crystal. NA48 was approved in 1991 when it became evident that more precision was needed to confirm the original observation (a competing programme at Fermilab called E731 did not find a significant deviation of the double ratio from unity). Both KTeV (the follow-up Fermilab experiment) and NA48 confirmed NA31’s results, firmly establishing direct CP violation. Continuations of the NA48 experiments studied rare decays of the short-lived neutral kaon and searched for direct CP violation in charged kaons. Nowadays the kaon programme continues with NA62, which is dedicated to the study of the very rare K⁺ → π⁺νν̄ decay and is complementary to the B-meson studies performed by the LHCb experiment.
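For readers unfamiliar with the measurement, the double ratio in question can be written (in standard notation, not taken from the original article) as

\[
R \;=\; \frac{\Gamma(K_L \to \pi^0\pi^0)\,/\,\Gamma(K_S \to \pi^0\pi^0)}{\Gamma(K_L \to \pi^+\pi^-)\,/\,\Gamma(K_S \to \pi^+\pi^-)} \;\approx\; 1 - 6\,\mathrm{Re}(\varepsilon'/\varepsilon),
\]

so that R = 1 if CP violation arises only from kaon mixing, and any departure from unity signals direct CP violation in the decay amplitudes.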

Heavy-ion experiments

In the mid-1980s, with a view to reproducing in the laboratory the plasma of free quarks and gluons predicted by QCD and believed to have existed in the early universe, the SPS was modified to accelerate beams of heavy ions and collide them with nuclei. The lack of a single striking signature of the formation of the plasma demands that researchers look for as many final states as possible, exploiting the evolution of standard observables (such as the yield of muon pairs from the Drell–Yan process or the production rate of strange quarks) as a function of the degree of overlap of the nuclei that participate in the collision (centrality). By 2000 several experiments had, according to CERN Courier in March that year, found “tantalising glimpses of mechanisms that shaped our universe”. The experiments included NA44, NA45, NA49, NA50, NA52 and NA57, as well as WA97 and WA98 in the West Area. Among the most popular signatures observed was the suppression of the J/ψ yield in ion–nucleus collisions with respect to proton–proton collisions, which was seen by NA50. Improved sensitivity to muon pairs was provided by the successor experiment NA60. The current heavy-ion programme at the North Area includes NA61/SHINE (see image), the successor of NA49, which is studying the onset of phase transitions in dense quark–gluon matter at different beam energies and for different beam species. Studies of the quark–gluon plasma continue today, in particular at the LHC and at RHIC in the US. At the same time, NA61/SHINE is measuring the yield of mesons from replica targets for neutrino experiments worldwide and particle production for cosmic-ray studies.

The post Fixed target, striking physics appeared first on CERN Courier.

CERN’s ultimate act of openness https://cerncourier.com/a/cerns-ultimate-act-of-openness/ Mon, 11 Mar 2019 14:50:45 +0000 https://preview-courier.web.cern.ch?p=13483 The seed that led CERN to relinquish ownership of the web in 1993 was planted when the Organization formally came into being.


At a mere 30 years old, the World Wide Web already ranks as one of humankind’s most disruptive inventions. Developed at CERN in the early 1990s, it has touched practically every facet of life, impacting industry, penetrating our personal lives and transforming the way we transact. At the same time, the web is shrinking continents and erasing borders, bringing with it an array of benefits and challenges as humanity adjusts to this new technology.

This reality is apparent to all. What is less well known, but deserves recognition, is the legal dimension of the web’s history. On 30 April 1993, CERN released a memo (see image) that placed into the public domain all of the web’s underlying software: the basic client, basic server and library of common code. The document was addressed “To whom it may concern” – which would suggest the authors were not entirely sure who the target audience was. Yet, with hindsight, this line can equally be interpreted as an unintended address to humanity at large.

The legal implication was that CERN relinquished all intellectual property rights in this software. It was a deliberate decision, the intention being that a no-strings-attached release of the software would “further compatibility, common practices, and standards in networking and computer supported collaboration” – arguably modest ambitions for what turned out to be such a seismic technological step. To understand what seeded this development you need to go back to the 1950s, at a time when “software” would have been better understood as referring to clothing rather than computing.

European project

CERN was born out of the wreckage of World War II, playing a role, on the one hand, as a mechanism for reconciliation between former belligerents, while, on the other, offering European nuclear physicists the opportunity to conduct their research locally. The hope was that this would stem the “brain drain” to the US, from a Europe still recovering from the devastating effects of war.

In 1953, CERN’s future Member States agreed on the text of the organisation’s founding Convention, defining its mission as providing “for collaboration among European States in nuclear research of a pure scientific and fundamental character”. With the public acutely aware of the role that destructive nuclear technology had played during the war, the Convention additionally stipulated that CERN was to have “no concern with work for military requirements” and that the results of its work were to be “published or otherwise made generally available”.

In the early years of CERN’s existence, the openness resulting from this requirement for transparency was essentially delivered through traditional channels, in particular through publication in scientific journals. Over time, this became the cultural norm at CERN, permeating all aspects of its work both internally and with its collaborating partners and society at large. CERN’s release of the WWW software into the public domain, arguably in itself a consequence of the openness requirement of the Convention, could be seen as a precursor to today’s web-based tools that represent further manifestations of CERN’s openness: the SCOAP3 publishing model, open-source software and hardware, and open data.

Perhaps the best measure of how ingrained openness is in CERN’s ethos as a laboratory is to ask the question: “if CERN had known then what it knows now about the impact of the World Wide Web, would it still have made the web software available, just as it did in 1993?” We would like to suggest that, yes, our culture of openness would provoke the same response now as it did then, though no doubt a modern, open-source licensing regime would be applied.

A culture of openness

This, in turn, can be viewed as testament and credit to the wisdom of CERN’s founders, and to the CERN Convention, which remains the cornerstone of our work to this day.

The post CERN’s ultimate act of openness appeared first on CERN Courier.

Reviews https://cerncourier.com/a/reviews-2/ Mon, 11 Mar 2019 14:02:49 +0000 https://preview-courier.web.cern.ch?p=13456 The Soviet Atomic Project: How the Soviet Union Obtained the Atomic Bomb. Advances in Particle Therapy: A multidisciplinary approach.

The Soviet Atomic Project: How the Soviet Union Obtained the Atomic Bomb
by Lee G Pondrom
World Scientific

“Leave them in peace. We can always shoot them later.” Thus spoke Soviet Union leader Josef Stalin, in response to a query by Soviet security and secret police chief Lavrentiy Beria about whether research in quantum mechanics and relativity (considered by Marxists to be incompatible with the principles of dialectical materialism) should be allowed. With these words, a generation of Soviet physical scientists were spared a disaster like the one perpetrated on Soviet agriculture by Trofim Lysenko’s politically correct, pseudoscientific theories of genetics. The reason behind this judgement was the successful development of nuclear weapons by Soviet physical scientists and the recognition by Stalin and Beria of the essential role that these “bourgeois” sciences played in that development.

Political intrigue, the arms race, early developments of nuclear science, espionage and more are all present in this gripping book, which provides a comprehensive account of the intensive programme the Soviets embarked on in 1945, immediately after Hiroshima, to catch up with the US in the area of nuclear weapons. A great deal is known about the Manhattan project, from the key scientists involved to the many Los Alamos incidents – such as Fermi’s determination of the Alamogordo test-blast energy using scraps of paper and Feynman’s ability to crack his Los Alamos colleagues’ safes – that are intrinsic parts of the US nuclear/particle-physics community’s culture. By contrast, little is known, at least in the West, about the huge effort made by the war-ravaged Soviet Union in less than five years to reach strategic parity with the US.

Pondrom, a prominent experimental particle physicist with a life-long interest in Russia and its language, provides an intriguing narrative. It is based on a thorough study of available literature plus a number of original documents – many of which he translated himself – that gives a fascinating insight into this history-changing enterprise and into the personalities of the exceptional people behind it.

The Soviet Atomic Project

The success of the Soviet programme was primarily due to Igor Kurchatov, a gifted experimental physicist and outstanding scientific administrator, who was equally at ease with laboratory workers, prominent theoretical physicists and the highest leaders in government, including Beria and Stalin himself. Saddled with developing several huge and remotely located laboratories from scratch, he remained closely involved in many important nitty-gritty scientific and engineering problems. For example, Kurchatov participated hands-on and full-time in the difficult commissioning of Reactor A, the first full-scale reactor for plutonium-239 production at the sprawling Combine #817 laboratory, receiving, along the way, a radiation dose that was 100 times the safe limit that he had established for laboratory staff members.

Beria was the overall project controller and ultimate decision-maker. Although best known for his role as Stalin’s ruthless enforcer – Pondrom describes him as “supreme evil,” Sakharov as a “dangerous man” – he was also an extraordinary organiser and a practical manager. When asked in the 1970s, long after Beria’s demise, how best to develop a Soviet equivalent of Silicon Valley, Soviet Academy of Sciences president A P Alexandrov answered “Dig up Beria.” Beria promised project scientists improved living conditions and freedom from persecution if they performed well (and that they would “be sent far away” if they didn’t). His daily access to Stalin was critical for keeping the project on track. Most of the project’s manual construction work used slave labour from Beria’s gulag.

Both the US and Soviet projects were monumental in scope; Pondrom estimates the Manhattan project’s scale to be about 2% of the US economy. The Soviet project’s scale was similar, but in an economy one-tenth the size. The Soviets had some advantage from the information gathered by espionage (and the simple fact that they knew the Manhattan project had succeeded). Also, German scientists interned in Russia for the project played important support roles, especially in the large-scale purification of reactor-grade natural uranium. In addition, there was a nearly unlimited supply of unpaid labourers, as well as German prisoners of war with scientific and engineering backgrounds whose participation in the project was rewarded with better living conditions.

The book is crisply written and well worth the read. The text includes a number of translated segments of official documents plus extracts from memoirs of some of the people involved. So, although Pondrom sprinkles his opinions throughout, there is sufficient material to permit readers to make their own judgements. He doesn’t shirk from explaining some of the complex technical issues, which he (usually) addresses clearly and concisely. The appendices expand on technical issues, some on an elementary level for non-physicists, and others, including isotope extraction techniques, nuclear reaction issues and encryption, in more detail, much of which was new to me.

On the other hand, the confusing assortment of laboratories, their locations, leaders and primary tasks begged for some kind of summary or graphics. The simple chart describing the Soviets’ complex espionage network in the US was useful for keeping track of the roles of the people involved; a similar chart for the laboratories and their roles would have been equally valuable. The book would also have benefited from a final edit that might have eliminated some of the repetition and caught some obvious errors. But these are minor faults in an engaging, informative book.

Stephen L Olsen, University of Chinese Academy of Sciences.

Advances in Particle Therapy: A multidisciplinary approach
by Manjit Dosanjh and Jacques Bernier (eds)
CRC Press, Taylor and Francis Group

A new volume in the CRC Press series on Medical Physics and Biomedical Engineering, this interesting book on particle therapy is structured in 19 chapters, each written by one or more co-authors out of a team of 49 experts (including the two editors). Most are medical physicists, radiation oncologists and radiobiologists who are well renowned in the field.

Advances in Particle Therapy

The opening chapter provides a brief and useful summary of the evolution of modern radiation oncology, starting from the discovery of X-rays up to the latest generation of proton and carbon-ion accelerators. The second and third chapters are devoted to the radiobiological aspects of particle therapy. After an introductory part where the concepts of relative biological effectiveness (RBE) and oxygen-enhancement ratio are defined, this section of the book goes on to review the most recent knowledge gained in the field, from DNA structure to the production of radiation-induced damage, to secondary cancer risk. The conclusion is that, as biological effects and clinical response are functions of a broad range of parameters, we are still far from a complete understanding of all radiobiological aspects underlying particle therapy, as well as from a universally accepted RBE model providing the optimum RBE value to be used for any given treatment.

Chapter 4 and, later, chapter 18 are dedicated to particle-therapy technologies. The first provides a simple explanation of the operating principles of particle accelerators and then goes into the details of beam delivery systems and dose conformation devices. Chapter 18 recalls the historical development of particle therapy in Europe, first with the European Light Ion Medical Accelerator (EULIMA) study and Proton-Ion Medical Machine Study (PIMMS), and then with the design and construction of the HIT, CNAO and MedAustron clinical facilities (CERN Courier January/February 2018 p25). It then provides an outlook on ongoing and expected future technological developments in accelerator design.

Chapter 5 discusses the general requirements for setting up a particle therapy centre, while the following chapter provides an extensive review of imaging techniques for both patient positioning and treatment verification. These are made necessary by the rapid spread of active beam delivery technologies (scanning) and robotic patient positioning systems, which have strongly improved dose delivery. Chapter 7 reviews therapeutic indications for particle therapy and explains the necessity to integrate it with all other treatment modalities so that oncologists can decide on the best combination of therapies for each individual patient. Chapter 8 reports on the history of the European Network of Light Ion Hadron Therapy (ENLIGHT) and its role in boosting collaborative efforts in particle therapy and in training specialists.

The central part of the book (chapters 9 to 15) reviews worldwide clinical results and indications for particle therapy from different angles, pointing out the inherent difficulties in comparing conventional radiation therapy and particle therapy. It analyses the two perspectives under which the dosimetric properties of particles can translate into clinical benefit: decreasing the dose to normal tissue to reduce complications, or scaling the dose to the tumour to improve tumour control without increasing the dose to normal tissue.

Chapter 16 discusses the economic aspects of particle therapy, such as cost-effectiveness and budget impact, while the following chapter describes the benefits of a “rapid learning health care” system. The last chapter discusses global challenges in radiation therapy, such as  how to bring medical electron linac technology to low- and middle-income countries (CERN Courier March 2017 p31). I found this last chapter slightly confusing as I did not understand what is meant by “radiation rotary” and I could not fully grasp the mixing-up of different topics, such as particle therapy and nuclear detonation-terrorism. This part also seemed too US-focussed when discussing the various initiatives, and I was not in agreement with some of the statements (e.g. that particle therapy has undergone a cost reduction by an order of magnitude or more in the past 10 years).

Overall, this book provides a useful compendium of state-of-the-art particle therapy and each chapter is supported by an extensive bibliography, meeting the expectations of both experts and readers interested in gaining an overview of the field. The essay is well structured, and enables readers to go through only selected chapters and in the order that they prefer. Some knowledge of radiobiology, clinical oncology and accelerator technology is assumed. It is disappointing that clinical dosimetry and treatment planning are not addressed other than in a brief mention in chapter 5, but perhaps this is something to consider for a second edition.

Marco Silari, CERN.

Mad maths
Theatre, CERN Globe
24 January 2019

Do you remember your high-school maths teachers? Were they strict? Funny? Extraordinary? Boring? The theatre comedy “Mad maths” presents the two most unusual teachers you can imagine. Armed with chalk and measuring tapes, Mademoiselle X and Mademoiselle Y aim to heal all those with maths phobia, and teach the audience more about their favourite subject.

On 24 January CERN’s fully booked Globe of Science and Innovation turned into a bizarre classroom. Marching along well-defined 90° angles, and meticulously measuring everything around them, the comedians Sophie Leclercq and Garance Legrou play with numbers and fight at the blackboard to make maths entertaining. The dialogues are juiced up with rap and music, spiced by friendly maths jargon, and seasoned with a hint of craziness. Bumping with trigonometry, philosophising about the number zero, and inventing new counting systems with dubious benefits, the rhythm grows exponentially. For example, did you know that some people’s mood goes up and down like a sine function? That you can make music with fractions? And that some bureaucratic steps are noncommutative?

This comedy show originated from an idea by Olivier Faliez and Kevin Lapin from the French theatre company Sous un autre angle. First studying maths at university, then attending theatre school, Faliez combined his two passions in 2003 to create an entertaining programme based on maths-driven jokes and unexpected turns of events.

Perfect for families with children, this French play has already been performed more than 500 times, especially at science festivals and schools. The topics are customised depending on the level of the students. Future showings are scheduled in Castanet (15 March), Les Mureaux (22 March) and in several schools in France and other countries. Teachers and event organisers who are interested in the show are advised to contact Sophie Leclercq.

At times foolish, at times witty, it is worth watching if and only if you want to unwind and rediscover maths from a different perspective.

Letizia Diamante, CERN.

The Life, Science and Times of Lev Vasilevich Shubnikov, Pioneer of Soviet Cryogenics
by L J Reinders
Springer

This book is a biography of Russian physicist Lev Vasilevich Shubnikov, whose work is scarcely known despite its importance and broad reach. It is also a portrayal of the political and ideological environment existing in the Soviet Union in the late 1930s under Stalin’s repressive regime.

The Life, Science and Times of Lev Vasilevich Shubnikov

While at Leiden University in the Netherlands, which at the time had the most advanced laboratory for low-temperature physics in the world, Shubnikov co-discovered the Shubnikov–De Haas effect: the first observation of quantum-mechanical oscillations of a physical quantity (in this case the resistance of bismuth) at low temperatures and high magnetic fields.

In 1930 Shubnikov went to Kharkov (as it is called in Russian) in Ukraine, where he built up the first low-temperature laboratory in the Soviet Union. There he led an impressive scientific programme and, together with his team, he discovered what is now known as type-II superconductivity (or the Shubnikov phase) and nuclear paramagnetism. In addition, independently of and almost simultaneously with Meissner and Ochsenfeld, they observed the complete diamagnetism of superconductors (today known as the Meissner effect).

In 1937, aged just 36, Shubnikov was arrested, processed by Stalin’s regime and executed “for no other reason than that he had shown evidence of independent thought”, as the author states.

Based on thorough document research and a collection of memories from people who knew Shubnikov, this book will appeal not only to those curious about this physicist, but also to readers interested in the history of Soviet science, especially the development of Soviet physics in the 1930s and the impact that Stalin’s regime had on it.

Virginia Greco, CERN.

The Workshop and the World, what ten thinkers can teach us about science and authority
by Robert P Crease
W. W. Norton & Company

In this book, science historian Robert Crease discusses the concept of scientific authority, how it has changed along the centuries, and how society and politicians interact with scientists and the scientific process – which he refers to as the “workshop”.

The Workshop and the World

Crease begins with an introduction about current anti-science rhetoric and science denial – the most evident manifestation of which is probably the claim that “global warming is a hoax perpetrated by scientists with hidden agendas”.

Four sections follow. In part one, the author introduces the first articulation of scientific authority through the stories of three renowned scientists and philosophers: Francis Bacon, Galileo Galilei and René Descartes. Here, some vulnerabilities of the authority of the scientific workshop emerge, but they are discussed further in the second section of the book through the stories of thinkers like Giambattista Vico, Mary Shelley and Auguste Comte.

Part three attempts to understand the deeply complicated relationship between the workshop and the world, described through the stories of Max Weber, Kemal Atatürk and his precursors, and Edmund Husserl. The final section is all about reinventing authority, discussed through the work of Hannah Arendt, a thinker who barely escaped the Holocaust and who provided a deep analysis of authority as well as providing clues as to how to restore it.

With this brilliantly written essay, Crease aims to explore what practising science for the common good means and to understand what makes a social and political atmosphere in which science denial can flourish. Finally, Crease tries to suggest what can be done to ensure that science and scientists regain the trust of the people.

Virginia Greco, CERN.

The post Reviews appeared first on CERN Courier.

Reviews https://cerncourier.com/a/reviews/ Thu, 24 Jan 2019 09:00:39 +0000 https://preview-courier.web.cern.ch/?p=13127 Lost in Math – How beauty leads physics astray. Amaldi’s last letter to Fermi: a monologue.

Lost in Math – How beauty leads physics astray
by Sabine Hossenfelder
Basic Books

The eye of the beholder

In Lost in Math, theoretical physicist Sabine Hossenfelder embarks on a soul-searching journey across contemporary theoretical particle physics. She travels to various countries to interview some of the most influential figures of the field (but also some “outcasts”) to challenge them, and be challenged, about the role of beauty in the investigation of nature’s laws.

Colliding head-on with the lore of the field and with practically all popular-science literature, Hossenfelder argues that beauty is overrated. Some leading scientists say that their favourite theories are too beautiful not to be true, or possess such a rich mathematical structure that it would be a pity if nature did not abide by those rules. Hossenfelder retorts that physics is not mathematics, and names examples of extremely beautiful and rich maths that does not describe the world. She reminds us that physics is based on data. So, she wonders, what can be done when an entire field is starved of experimental breakthroughs?

Confirmation bias

Nobel laureate Steven Weinberg, interviewed for this book, argues that experts call “beauty” the experience-based feeling that a theory is on a good track. Hossenfelder is sceptical that this attitude really comes from experience. Maybe most of the people who chose to work in this field were attracted to it, in the first place, because they like mathematics and symmetries, and would not have worked in the field otherwise. We may be victims of confirmation bias: we choose to believe that aesthetic sense leads to correct theories; hence, we easily recall to memory all of the correct theories that possess some quality of beauty, while we do not pay equal attention to the counterexamples. Dirac and Einstein, among many others, vocally affirmed beauty as a guiding principle, and achieved striking successes by following its guidance; however, they also had, as Hossenfelder points out, several spectacular failures that are less well known. Moreover, a theoretical sense of beauty is far from universal. Copernicus made a breakthrough because he sought a form of beauty that differed from those of his predecessors, making him think out of the box; and by today’s taste, Kepler’s solar system of platonic solids feels silly and repulsive.

Hossenfelder devotes attention to a concept that is particularly relevant to contemporary particle physics: the “naturalness principle” (see Understanding naturalness). Take the case of the Higgs mass: the textbook argument is that quantum corrections go wild for the Higgs boson, making any mass value between zero and the Planck mass a priori possible; however, its value happens to be closer to zero than to the Planck mass by a factor of 10¹⁷. Hence, most particle physicists argue that there must be an almost perfect cancellation of corrections, a problem known as the “hierarchy problem”. Hossenfelder points out that implicit in this simple argument is that all values between zero and the Planck mass should be equally likely. “Why,” she asks, “are we assuming a flat probability, instead of a logarithmic (or whatever other function) one?” In general, we say that a new theory is necessary when a parameter value is unlikely, but she argues that we can estimate the likeliness of that value only when we have a prior likelihood function, for which we would need a new theory.
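For orientation (using standard values rather than numbers quoted from the book), the factor of 10¹⁷ follows directly from the measured Higgs mass and the Planck scale:

\[
\frac{m_H}{M_{\mathrm{Pl}}} \;\approx\; \frac{125\ \mathrm{GeV}}{1.2\times 10^{19}\ \mathrm{GeV}} \;\approx\; 10^{-17}.
\]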

New angles

Hossenfelder illustrates various popular solutions to this naturalness problem, which in essence all try to make small values of the Higgs mass much more likely than large ones. She also discusses string theory, as well as multiverse hypotheses and anthropic solutions, exposing their shortcomings. Some of her criticisms may recall Lee Smolin’s The Trouble with Physics and Peter Woit’s Not Even Wrong, but Hossenfelder brings new angles to the discussion.

This book comes out at a time when more and more specialists are questioning the validity of naturalness-inspired predictions. Many popular theories inspired by the naturalness problem share an empirical consequence: either they manifest themselves soon in existing experiments, or they definitely fail in solving the problems that they were invented for.

Hossenfelder describes in derogatory terms the typical argumentative structure of contemporary theory papers that predict new particles “just around the corner”, while explaining why we did not observe them yet. She finds the same attitude in what she calls the “di-photon diarrhoea”, i.e., the prolific reaction of the same theoretical community to a statistical fluctuation at a mass of around 750 GeV in the earliest data from the LHC’s Run 2.

The author explains complex matters at the cutting edge of theoretical physics research in a clear way, with original metaphors and appropriate illustrations. With this book, Hossenfelder not only reaches out to the public, but also invites it to join a discourse that she is clearly passionate about. The intended readership ranges from fellow scientists to the layperson, also including university administrators and science policy makers, as is made explicit in an appendix devoted to practical suggestions for various categories of readers.

While this book will mostly attract attention for its pars destruens, it also contains a pars construens. Hossenfelder argues for looking away from the lamppost, both theoretically and experimentally. Having painted naturalness arguments as a red herring that drives attention away from the real issues, and acknowledging throughout the book that when data offer no guidance there is no other choice than following some non-empirical assessment criteria, she advocates other criteria that deserve better prominence, such as the internal consistency of the theoretical foundations of particle physics.

As a non-theorist my opinion carries little weight, but my gut feeling is that this direction of investigation, although undeniably crucial, is not comparably “fertile”. On the other hand, Hossenfelder makes it clear that she sees nothing scientific in this kind of fertility, and even argues that bibliometric obsessions played a big role in creating what she depicts as a gigantic bibliographical bubble. Inspired by that, Hossenfelder also advises learning how to recognise and mitigate biases, and building a culture of criticism both in the scientific arena and in response to policies that create short-term incentives, going against the idea of exploring less conventional ideas. Regardless of what one may think about the merits of naturalness or other non-empirical criteria, I believe that these suggestions are uncontroversially worthy of consideration.

Andrea Giammanco, UCLouvain, Louvain-la-Neuve, Belgium.

Amaldi’s last letter to Fermi: a monologue
Theatre, CERN Globe
11 September 2018

Ideas shaker

On the occasion of the 110th anniversary of the birth of Italian physicist Edoardo Amaldi (1908–1989), CERN hosted a new production titled “Amaldi l’italiano, centodieci e lode!” The title is a play on words concerning the top score at an Italian university (“110 cum laude”) and the production is a well-deserved recognition of a self-confessed “ideas shaker” who was one of the pioneers in the establishment of CERN, the European Space Agency (ESA) and the Italian National Institute for Nuclear Physics (INFN).

The nostalgic monologue opens with Amaldi, played by Corrado Calda, sitting at his desk and writing a letter to his mentor, Enrico Fermi. Set on the last day of Amaldi’s life, the play retraces some of his scientific, personal and historical memories, which pass by while he writes.

It begins in 1938 when Amaldi is part of an enthusiastic group of young scientists, led by Fermi and nicknamed “Via Panisperna boys” (boys from Panisperna Road, the location of the Physics Institute of the University of Rome). Their discoveries on slow neutrons led to Fermi’s Nobel Prize in Physics that year.

Then, suddenly, World War II begins and everything falls apart. Amaldi writes about his frustrations to his teacher, who had passed away but is still close to him. “While physicists were looking for physical laws, Europe sank into racial laws,” he despairs. Indeed, most of his colleagues and friends, including Fermi, who had a Jewish wife, moved to the US. Left alone in Italy, Amaldi decided to stop his studies on fission and focus on cosmic rays, a type of research that required fewer resources and was not related to military applications.

Out of the ruins

After World War II, while in Italy there was barely enough money to buy food, the US was building state-of-the-art particle-physics detectors. Amaldi described his strong temptation to cross the ocean and rejoin Fermi. However, he decided to stay in war-torn Europe and help European science grow out of the ruins. He worked to achieve his dream of “a laboratory independent from military organisations, where scientists from all over the world could feel at home” – today known as CERN. He was general secretary of CERN between 1952 and 1954, before its official foundation in September 1954.

This beautiful monologue is interspersed with radio messages from the epoch, which announce salient historical facts. These create a factual atmosphere that becomes less and less tense as alerts about Nazi declarations and bombs are replaced by news about the first women’s vote, the landing of the first person on the Moon, and disarmament movements.

Written and directed by Giusy Cafari Panico and Corrado Calda, the play was composed after consulting with Edoardo’s son, Ugo Amaldi, who was present at the inaugural performance. The script is so rich in information that you leave the theatre feeling you now know a lot about scientific endeavours, mindsets and the general zeitgeist of the last century. Moreover, the play touches on some topics that are still very relevant today, including: brain drain, European identity, women in science and the use of science for military purposes.

The event was made possible thanks to the initiative of Ugo Amaldi, CERN’s Lucio Rossi, the Edoardo Amaldi Association (Fondazione Piacenza e Vigevano, Italy), and several sponsors. The presentation was introduced by former CERN Director-General Luciano Maiani, who was Edoardo Amaldi’s student, and current CERN Director-General Fabiola Gianotti, who expressed her gratitude for Amaldi’s contribution in establishing CERN.

Letizia Diamante, CERN.

Topological and Non-Topological Solitons in Scalar Field Theories
by Yakov M Shnir
Cambridge University Press

In the 19th century, the Scottish engineer John Scott Russell was the first to observe what he called a “wave of transition” while watching a boat drawn along a channel by a pair of horses. This phenomenon is now referred to as a soliton and described mathematically as a stable, non-dissipative wave packet that maintains its shape while propagating at a constant velocity.

Solitons emerge in various nonlinear physical systems, from nonlinear optics and condensed matter to nuclear physics, cosmology and supersymmetric theories.

Structured in three parts, this book provides a comprehensive introduction to the description and construction of solitons in various models. In the first two chapters of part one, the author discusses the properties of topological solitons in the completely integrable Sine-Gordon model and in the non-integrable models with polynomial potentials. Then, in chapter three, he introduces solitary wave solutions of the Korteweg–de Vries equation, which provide an example of non-topological solitons.

Part two deals with higher-dimensional nonlinear theories. In particular, the properties of scalar soliton configurations are analysed in two (2+1)-dimensional systems: the O(3) nonlinear sigma model and the baby Skyrme model. Part three focuses mainly on solitons in three spatial dimensions. Here, the author covers stationary Q-balls and their properties. Then he discusses soliton configurations in the Skyrme model (called skyrmions) and the knotted solutions of the Faddeev–Skyrme model (hopfions). The properties of the related deformed models, such as the Nicole and the Aratyn–Ferreira–Zimerman models, are also summarised.

Based on the author’s lecture notes for a graduate-level course, this book is addressed at graduate students in theoretical physics and mathematics, as well as researchers interested in solitons.

Virginia Greco, CERN.

Universal Themes of Bose–Einstein Condensation
by Nick P Proukakis, David W Snoke and Peter B Littlewood
Cambridge University Press

The study of Bose–Einstein condensation (BEC) has undergone an incredible expansion during the last 25 years. Back then, the only experimentally realised Bose condensate was liquid helium-4, whereas today the phenomenon has been observed in a number of diverse atomic, optical and condensed-matter systems. The turning point for BEC came in 1995, when three different US groups reported the observation of BEC in trapped, weakly interacting atomic gases of rubidium-87, lithium-7 and sodium-23 within weeks of one another. These studies led to the 2001 Nobel Prize in Physics being jointly awarded to Eric Cornell, Wolfgang Ketterle and Carl Wieman.

This book is a collection of essays written by leading experts on various aspects and in different branches of BEC, which is now a broad and interdisciplinary area of modern physics. Composed of five parts, the volume starts with the history of the rapid development of this field and then takes the reader through the most important results.

The second part provides an extensive overview of various general themes related to universal features of Bose–Einstein condensates, such as the question of whether BEC involves spontaneous symmetry breaking, of how the ideal Bose gas condensation is modified by interactions between the particles, and the concept of universality and scale invariance in cold-atom systems. Part three focuses on active research topics in ultracold environments, including optical lattice experiments, the study of distinct sound velocities in ultracold atomic gases – which has shaped our current understanding of superfluid helium – and quantum turbulence in atomic condensates.

Part four is dedicated to the study of condensed-matter systems that exhibit various features of BEC, while in part five possible applications of the study of condensed matter and BEC to answer questions on astrophysical scales are discussed.

Virginia Greco, CERN.

Zeros of Polynomials and Solvable Nonlinear Evolution Equations
by Francesco Calogero
Cambridge University Press

This concise book discusses the mathematical tools used to model complex phenomena via systems of nonlinear equations, which can be useful to describe many-body problems.

Starting from a well-established approach to solvable dynamical systems identification, the author proposes a novel algorithm that allows some of the restrictions of this approach to be eliminated and, thus, identifies more solvable/integrable N-body problems. After reporting this new differential algorithm to evaluate all the zeros of a generic polynomial of arbitrary degree, the book presents many examples to show its application and impact. The author first discusses systems of ordinary differential equations (ODEs), including second-order ODEs of Newtonian type, and then moves on to systems of partial differential equations and equations evolving in discrete time-steps.

This book is addressed to both applied mathematicians and theoretical physicists, and can be used as a basic text for a topical course for advanced undergraduates.

Virginia Greco, CERN.

The post Reviews appeared first on CERN Courier.

Hadrons at Finite Temperature https://cerncourier.com/a/hadrons-at-finite-temperature/ Fri, 30 Nov 2018 14:04:37 +0000 https://preview-courier.web.cern.ch/?p=101298 This monograph explains the ideas involved in the theoretical analysis of the data produced in heavy-ions collisions.

By Samirnath Mallik and Sourav Sarkar
Cambridge University Press

Hadrons at Finite Temperature

In high-energy physics laboratories, experiments use heavy-ion collisions to investigate the properties of matter at extremely high temperature and density, and to study the quark–gluon plasma. This monograph explains the ideas involved in the theoretical analysis of the data produced in such experiments. It comprises three parts, the first two of which are independent but lay the ground for the topics addressed later.

The book starts with an overview of the (vacuum) theory of hadronic interactions at low energy: vacuum propagators for fields of different spins are introduced and then the phenomenon of spontaneous symmetry breaking leading to Goldstone bosons and chiral perturbation theory are discussed.

The second part covers equilibrium thermal field theory, which is formulated in the real time method. Finally, in the third part, the methods previously developed are applied to the study of different thermal one- and two-point functions in the hadronic phase, using chiral perturbation theory.

The book includes selected exercises at the end of each chapter, all fully worked out. These are used to provide important side results or to develop calculations without breaking the flow of the main text. Similarly, some of the results mentioned in the text are derived in a few appendices. It is a useful reference for graduate students interested in relativistic thermal field theory.

The post Hadrons at Finite Temperature appeared first on CERN Courier.

Relativity In Modern Physics https://cerncourier.com/a/relativity-in-modern-physics/ Fri, 30 Nov 2018 14:02:13 +0000 https://preview-courier.web.cern.ch/?p=101294 This new advanced textbook on relativity aims to present all the different aspects of this brilliant theory and its applications.

By Nathalie Deruelle and Jean-Philippe Uzan
Oxford University Press

Relativity in Modern Physics

A century after its formulation by Einstein, the theory of general relativity is at the core of our interpretation of various astrophysical and cosmological observations – from neutron stars and black-hole formation to the accelerated expansion of the universe. This new advanced textbook on relativity aims to present all the different aspects of this brilliant theory and its applications. It brings together, in a coherent way, classical Newtonian physics, special relativity and general relativity, emphasising common underlying principles.

The book is structured in three parts around these topics. First, the authors provide a modern view of Newtonian theory, focusing on the aspects needed for understanding quantum and relativistic contemporary physics. This is followed by a discussion of special relativity, presenting relativistic dynamics in inertial and accelerated frames, and an overview of Maxwell’s theory of electromagnetism.

In the third part the authors delve into general relativity, developing the geometrical framework in which Einstein’s equations are formulated, and present many relevant applications, such as black holes, gravitational radiation and cosmology.

This book is aimed at undergraduate and graduate students, as well as researchers wishing to acquire a deeper understanding of relativity. But it could also appeal to the curious reader with a scientific background who is interested in discovering the profound implications of relativity and its applications.

The post Relativity In Modern Physics appeared first on CERN Courier.

Enjoy Our Universe, You Have No Other Choice https://cerncourier.com/a/101293-2/ Fri, 30 Nov 2018 13:59:08 +0000 https://preview-courier.web.cern.ch/?p=101293 Massimo Giovannini reviews in 2018 Enjoy Our Universe, You have No Other Choice.

Enjoy Our Universe, You Have No Other Choice
By Alvaro De Rújula
Oxford University Press

Enjoy Our Universe

Scientific essays well suited to the interested layperson are notoriously difficult to write. It is then not surprising that various popular books, articles and internet sites recycle similar analogies – or even entire discussions – to explain scientific concepts with the same standardised, though very polished, language. CERN theorist Alvaro De Rújula recently challenged this unfortunate and relatively recent trend by proposing a truly original and unconventional essay for agile minds. There are no doubts that this book will be appreciated not only by the public but also by undergraduate students, teachers and active scientists.

Enjoy our Universe consists of 37 short chapters accounting for the serendipitous evolution of basic science in the last 150 years, roughly starting with the Faraday–Maxwell unification and concluding with the discovery of the Higgs boson and of gravitational waves. While going through the “fun” of our universe, the author describes the conceptual and empirical triumphs of classical and quantum field theories without indulging in excessive historic or technical details. Those who had the chance to attend lectures or talks given by De Rújula will recognise the “parentheses” (i.e. swift digressions) that he literally opens and closes in his presentations with gigantic brackets on the slides. A rather original glossary is included at the end of the text for the benefit of general readers.

This book is also a collection of opinions, reminiscences and healthy provocations of an active scientist whose contributions undeniably shaped the current paradigm of fundamental interactions. This is a bonus for practitioners of the field (and for curious colleagues), who will often find the essence of long-standing diatribes hidden in a collection of apparently innocent jokes or in the caption of a figure. As the author tries to argue in his introduction, science should always be discussed with that joyful and playful attitude we normally use when talking about sport and other interesting matters not immediately linked to the urgencies of daily life.

One of the most interesting subliminal suggestions of this book is that physics is not a closed logical system. Basic science in general (and physics in particular) can only prosper if the confusion of ideas is tolerated and encouraged, at least within certain reasonable limits.

The text is illustrated with drawings by the author himself and this aspect, among others, brings to mind an imaginative popular essay by George Gamow (Gravity 1962), where the author drew his own illustrations (unfortunately not in colour) with a talent comparable to De Rújula’s. The inspiration in this book is also reminiscent of the autobiographical essay by Victor Weisskopf written almost 30 years ago, entitled The Joy of Insight, which echoes the enjoyment of the universe and suggests that the true motivation for basic science is the fun of curiosity: all the rest is irrelevant. So, please, enjoy our universe since you have no other choice!

The post Enjoy Our Universe, You Have No Other Choice appeared first on CERN Courier.

]]>
Review Massimo Giovannini reviews in 2018 Enjoy Our Universe, You have No Other Choice. https://cerncourier.com/wp-content/uploads/2018/11/CCDec18_Book-ruula.jpg
The Pope of Physics: Enrico Fermi and the Birth of the Atomic Age https://cerncourier.com/a/the-pope-of-physics-enrico-fermi-and-the-birth-of-the-atomic-age/ Fri, 30 Nov 2018 13:52:01 +0000 https://preview-courier.web.cern.ch/?p=101290 Enrico Fermi can be considered as one of the greatest physicists of all time due to his genius creativity in both theoretical and experimental physics. This book describes his prodigious story, as a man and a scientist.

The post The Pope of Physics: Enrico Fermi and the Birth of the Atomic Age appeared first on CERN Courier.

]]>
By Gino Segrè and Bettina Hoerlin
Henry Holt and Co.

The Pope of Physics

Enrico Fermi can be considered one of the greatest physicists of all time thanks to his creative genius in both theoretical and experimental physics. This book describes his prodigious story, as a man and a scientist.

Born in Rome in 1901, Fermi spent the first part of his life in Italy, where he made his brilliant debut in theoretical physics in 1926 by applying statistical mechanics to atomic physics in a quantum framework, thus sealing the birth of what is now known as Fermi–Dirac statistics. In 1933 he postulated the original theory of weak interactions to explain the mysterious results on nuclear β decays. Having soon become a theoretical “superstar”, he then switched to experimental nuclear physics, leading a celebrated team of young physicists at the University of Rome, known as the “boys”. Among them were Edoardo Amaldi, Ettore Majorana, Bruno Pontecorvo, Franco Rasetti and Emilio Segrè. They nicknamed him “the Pope” since he knew and understood everything and was considered to be simply infallible. His discoveries on neutron-induced radioactivity and on the neutron slowing-down effect earned him the Nobel Prize in Physics in 1938.
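For readers who would like the formula behind the name (a standard result of statistical mechanics, recalled here as an editorial aside rather than quoted from the book), the Fermi–Dirac distribution gives the mean occupation of a single-particle state of energy E at temperature T as

\[ \langle n(E)\rangle \;=\; \frac{1}{e^{(E-\mu)/k_{\mathrm B}T}+1}, \]

where μ is the chemical potential; the “+1” in the denominator enforces the Pauli exclusion principle and is what distinguishes fermions from the Bose–Einstein case.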

Those were, however, difficult years for Fermi because of Italy’s inconsistent research strategy and the harsh political climate of fascism and antisemitism. Fermi left for the US with his family in December 1938, using the Nobel ceremony as a chance to travel abroad. Initially at Columbia University, Fermi then moved to the “Met Lab” of the University of Chicago, which was the seed of the Manhattan Project. There, in December 1942, he built the first nuclear reactor to achieve a self-sustaining chain reaction. The breakthrough ushered in the nuclear age, leaving a lasting impact on physics, engineering, medicine and energy – not to mention the development of nuclear weapons. In 1944 Fermi moved to the Manhattan Project’s secret laboratory in Los Alamos, where he collaborated with some of the world’s top scientists, including Hans Bethe, Niels Bohr, Richard Feynman, John von Neumann, Isidor Rabi, Leo Szilard and Edward Teller. These were terrible times of war.

When the Second World War concluded, Fermi resumed his research activities with energy and enthusiasm. On the experimental front he focused on nuclear physics, particle accelerators and technology, and early computers. On the theoretical front he concentrated on the origin of extremely high-energy cosmic rays. He also campaigned for the peaceful use of nuclear energy. As in Rome, in Chicago he was also the master of a wonderful school of pupils, among whom were several Nobel laureates. Fermi sadly died prematurely in 1954.

This book is about the epic life of Fermi, mostly known to the general public for the first ever nuclear reactor and the Manhattan Project, but to scientists for his theoretical and experimental discoveries – all diverse and crucial in modern physics – which always resulted in major advances. He remains less known as a personality or a public figure, and his scientific legacy is somehow underestimated. The merit of this book is therefore to bring Fermi’s genius within everyone’s reach.

Many renowned texts have been dedicated to Fermi over the years, offering various perspectives on his life and his work. On Fermi’s personal life there is, first of all, Atoms in the Family (1954) by his widow, Laura. Exhaustive information about Fermi’s outstanding work in physics can be found in the volume Enrico Fermi, Physicist (1970) by his friend and colleague Emilio Segrè, Nobel laureate and Gino Segrè’s uncle, and in Enrico Fermi: Collected Papers, two volumes published in the 1960s by the University of Chicago. Also worth mentioning are: Fermi Remembered (2004), edited by Nobel laureate James W Cronin; Enrico Fermi: His Work and Legacy (2001, then 2004), edited by C Bernardini and L Bonolis, and The Lost Notebook of Enrico Fermi by F Guerra and N Robotti (2015, then 2017), both published by the Italian Physical Society–Springer. Finally, published almost at the same time as Segrè and Hoerlin’s book, is another biography of Fermi: The Last Man Who Knew Everything by D N Schwartz, the son of Nobel laureate Melvin Schwartz. In their “four-handed” book, Segrè and Hoerlin have highlighted with expertise the scientific biography of Fermi and his extraordinary achievements, and described with emotion the human, social and political aspects of his life.

Readers familiar with Fermi’s story will enjoy this book, which is as scientifically sound as a textbook but at the same time bears the gripping character of a novel.

The post The Pope of Physics: Enrico Fermi and the Birth of the Atomic Age appeared first on CERN Courier.

]]>
Review Enrico Fermi can be considered as one of the greatest physicists of all time due to his genius creativity in both theoretical and experimental physics. This book describes his prodigious story, as a man and a scientist. https://cerncourier.com/wp-content/uploads/2018/11/CCDec18_Book-segre.jpg
Inside Story: On the Courier’s new future https://cerncourier.com/a/inside-story-on-the-couriers-new-future/ Fri, 30 Nov 2018 09:00:53 +0000 https://preview-courier.web.cern.ch/?p=12977 “I think the Courier is excellent; it’s sort of ‘frozen in time’, but in a rather appropriate and appealing way.”

The post Inside Story: On the Courier’s new future appeared first on CERN Courier.

]]>
“I think the Courier is excellent; it’s sort of ‘frozen in time’, but in a rather appropriate and appealing way.” Of all the lively comments received from the 1400 or so readers who took part in our recent survey (see below), this one sums things up for the CERN Courier. “Excellent” might be a stretch for some, but, coming up for its 60th anniversary, this well-regarded periodical is certainly unique. It has been alongside high-energy physics as the field has grown up, from the rise of the Standard Model to the strengthening links with cosmology and astrophysics, the increasing scale and complexity of accelerators, detectors and computing, the move to international collaborations involving thousands of people, and other seismic shifts.

In terms of presentation, though, the Courier is indeed ripe for change. The website preview-courier.web.cern.ch was created in 1998 when the magazine’s production and commercial dimensions were outsourced to IOP Publishing in the UK. Updated only 10 times per year with the publication of each print issue, the website has had a couple of makeovers (one in 2007 and one earlier this year) but its functionality has remained essentially unchanged for 20 years.

A semi-static, print-led website is no longer best suited to today’s publishing scene. The sheer flexibility of online publishing allows more efficient ways to communicate different stories to new audiences. Our survey concurs: a majority of readers (63%) indicated that they were willing to receive fewer print copies per year if preview-courier.web.cern.ch was updated more regularly – a view held most strongly among younger responders. It is this change to its online presence that drives the new publishing model of CERN Courier from 2019, with a new, dynamic website planned to launch in the spring.

At the same time, there is high value attached to a well-produced print magazine that worldwide readers can receive free of charge. And, as the results of our survey show, a large section of the community reads the Courier only when they pick up a copy in their labs or universities to browse over lunch or while travelling. That’s why the print magazine is staying, though at a reduced frequency of six rather than 10 issues per year. To reflect this change, the magazine will have a new look from next year. Among many improvements, we have adopted a more readable font, a clearer layout and other modern design features. There are new and revised sections covering careers, opinion and reviews, while the feature articles – the most popular according to our survey – will remain as the backbone of the issue.

It is sometimes said that the Courier can be a bit too formal, a little dry. Yet our survey did not reveal a huge demand to lighten things up – so don’t expect to see Sudoku puzzles or photos of your favourite pet any time soon. That said, the Courier is a magazine, not an academic journal; in chronicling progress in global high-energy physics it strives to be as enjoyable as it is authoritative.

Another occasional criticism is that the Courier is a mere mouthpiece for CERN. If it is, then it is also – and unashamedly – a mouthpiece for other labs and for the field as a whole. Within just a few issues of its publication, the Courier outgrew its original editorial remit and expanded to cover activities at related laboratories worldwide (with the editorially distinct CERN Bulletin serving the internal CERN community). The new-look Courier will also retain an important sentence on its masthead demarcating the views stated in the magazine from those of CERN management.

A network of around 30 laboratory correspondents helps to keep the magazine updated with news from their facilities on an informal basis. But the more members of the global high-energy physics community who interact, the better the Courier can serve them. Whether it’s a new result, experiment, machine or theorem, an event, appointment or prize, an opinion, review or brazen self-promotion, get in touch at cern.courier@cern.ch.

Reader survey: the results are in

To shape the Courier’s new life in print and online, a survey was launched this summer in conjunction with IOP Publishing to find out what readers think of the magazine and website, and what changes could be made. The online survey asked 21 questions and responders were routed to different sections of the survey depending on the answers they provided. Following promotion on preview-courier.web.cern.ch, CERN’s website, CERN Bulletin, social media channels and e-mails to CERN users, there were a total of 1417 responses.

Chart showing age of survey responders

Responders were split roughly 3:1 male to female, with a fairly even age distribution. Geographically, they were based predominantly in France, the US, Italy, Switzerland, Germany and the UK. Some 43% of the respondents work at a university, followed by a national or international research institute (34%), with the rest working in teaching (5%) and various industries. While three-quarters of the respondents named experimental particle physics as their main domain of work, many have other professional interests ranging from astronomy to marketing.

Responders were evenly split between those who read the printed magazine and those who don’t. Readers tend to read the magazine on a regular basis and, overall, have been reading it for a significant period of time. A majority (54.1%) do not read the magazine via a direct subscription, and the data suggest that one copy of the Courier is typically read by more than one person.

Graph showing professional positions

In terms of improving the CERN Courier website, there was demand for a mobile-optimised platform and for video content, though a number of respondents were unaware that the website even existed. Importantly for the future of CERN Courier, a majority of readers (63%) indicated that they were willing to receive fewer print copies per year if preview-courier.web.cern.ch was updated more regularly; this trend was sharpest in the under-30 age group.

When it comes to the technical level of the articles, which is a topic of much consideration at the Courier, the responses indicate that the level is pitched just right (though, clearly, a number of readers will find some topics tougher than others given the range of subfields within high-energy physics). Readers also felt that their fields were well represented, and agreed that more articles about careers and people would be of interest.

Graph showing professional interests

Many written comments were provided, a few of which are listed here: “More investigative articles please”; “I would like that it has a little glossary”; “A column about people themselves, not only the physics they do”; “More debate on topics on which there is discussion in the field”; “Please do NOT modify CERN Courier into a ‘posher’ version”; “Leave out group photos of people at big meetings”; and “Make a CERN Courier kids edition”. The overwhelming majority of comments were positive, and the few that weren’t stood out: “The whole magazine reads like propaganda for CERN and for the Standard Model”; “The Courier style is intentionally humourless, frigid, stale and boring. Accordingly, almost everybody agrees that the obituaries are by far its best part”; and, curiously, “The actual format is so boring that I stop to read it!”

It only remains to thank participants of the survey and to congratulate the winners of our random prize draw (V Boudry, J Baeza, S Clawson, V Lardans and M Calvetti), who each receive a branded CERN hoodie.

The post Inside Story: On the Courier’s new future appeared first on CERN Courier.

]]>
Feature “I think the Courier is excellent; it’s sort of ‘frozen in time’, but in a rather appropriate and appealing way.” https://cerncourier.com/wp-content/uploads/2018/11/CCDec18_Inside-collage.png
The tale of a billion-trillion protons https://cerncourier.com/a/the-tale-of-a-billion-trillion-protons/ Fri, 30 Nov 2018 09:00:39 +0000 https://preview-courier.web.cern.ch/?p=12959 Linac2, the machine that feeds CERN’s accelerator complex with protons, has entered a well-deserved retirement after 40 years of service.

The post The tale of a billion-trillion protons appeared first on CERN Courier.

]]>

Before being smashed into matter at high energies to study nature’s basic laws, protons at CERN begin their journey rather uneventfully, in a bottle of hydrogen gas. The protons are separated by injecting the gas into the cylinder of an ion source and making an electrical discharge, after which they enter what has become the workhorse of CERN’s proton production for the past 40 years: a 36 m-long linear accelerator called Linac2. Here, the protons are accelerated to an energy of 50 MeV, reaching approximately one-third of the speed of light, ready to be injected into the first of CERN’s circular machines: the Proton Synchrotron Booster (PSB), followed by the Proton Synchrotron (PS) and the Super Proton Synchrotron (SPS). At each stage of the chain, they may end up driving fixed-target experiments, generating exotic beams in the ISOLDE facility, or being injected into the Large Hadron Collider (LHC) to be accelerated to the highest energies.
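As a quick back-of-envelope check of the quoted speed (an editorial illustration rather than part of the original article; the proton rest energy of 938.27 MeV is the standard value), the velocity follows directly from relativistic kinematics:

import math

def beta(T_MeV, mc2_MeV=938.272):
    """Speed as a fraction of c for a proton of kinetic energy T_MeV."""
    gamma = 1.0 + T_MeV / mc2_MeV          # total energy over rest energy
    return math.sqrt(1.0 - 1.0 / gamma**2)

print(f"50 MeV (Linac2):  v/c = {beta(50):.3f}")   # about 0.31 -- one-third of c
print(f"160 MeV (Linac4): v/c = {beta(160):.3f}")  # about 0.52, for the successor machine discussed below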

Situated at ground level on the main CERN site, Linac2 has delivered all of the protons for the CERN accelerator complex since 1978. Construction of Linac2 started in December 1973, and the first 50 MeV beam was obtained on 6 September 1978. Within a month, the design current of 150 mA was reached and the first injection tests in the PSB started. Routine operation of the PSB started soon afterwards, in December 1978. As proudly announced by CERN at the time, Linac2 was completed on budget and on schedule, for an overall cost of 23 million Swiss francs.

Linac2 is the machine that started more than a billion-trillion protons on trajectories that led to discoveries including the W and Z bosons, the creation of antihydrogen and the completion of the long search for the Higgs boson. On 12 November, Linac2 was switched off and will now be decommissioned as part of a major upgrade to the laboratory’s accelerator complex (CERN Courier October 2017 p32). Its design, operation and performance have been key factors in the success of CERN’s scientific programme and paved the way to its successor, Linac4, which will take over the task of producing CERN’s protons from 2020.

The decision to build Linac2 was taken in October 1973, with the aim to provide a higher-intensity proton beam compared to the existing Linac1 machine. Linac1 had been the original injector both to the PS when it began service in 1959, and to its booster (the PSB) when it was added to the chain in 1972. However, Linac1 was limited in the intensity it could provide, and the only way to higher intensity was for an entirely new construction.

Forward thinking

Linac2’s design parameters were chosen to comfortably exceed the nominal PSB requirements, providing a safety margin during operation and for future upgrades. Furthermore, it was decided to install the linac in a new building parallel to the Linac1 location instead of in the Linac1 tunnel. This avoided a long shut-down for installation and commissioning, and ensured that Linac1 was available as a back-up during the first years of Linac2 operation.

Linac2’s proton source was originally a huge 750 kV Cockcroft–Walton generator located in a shielded room, separate from the accelerator hall (figure 1), which provided the pre-acceleration to the entrance of the 4 m-long low energy beam transport line (LEBT). This transport line included a bunching system made of three RF cavities, after which protons were fed to the main accelerator: a drift-tube linac (DTL) that had many improvements with respect to the Linac1 design and became a standard for linacs at the time. The three accelerating RF “tanks”, increasing the beam energy up to 10.3, 30.5 and 50 MeV, respectively, with a total length of 33.3 m, were made of mild steel co-laminated with a copper sheet, with the vacuum and RF sealing provided by aluminium wire joints.

The RF system is of prime importance for the performance of linear accelerators. For Linac2, the amplifiers had to provide a total RF power of 7.5 MW just to accelerate the beam. The RF amplifiers were based on the Linac1 design principles, with larger diameters in order to safely deliver the higher power, and the RF tube was the same triode already used for most of the Linac1 amplifiers.
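The 7.5 MW figure is, in effect, the power carried away by the beam itself during the pulse (an editorial aside, not a statement from the original article): a 150 mA beam gaining 50 MeV per proton draws

\[ P_{\mathrm{beam}} \;=\; I\,\Delta V \;=\; 0.15\ \mathrm{A}\times 50\ \mathrm{MV} \;=\; 7.5\ \mathrm{MW} \]

from the RF system, on top of whatever is dissipated in the cavity walls.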

The most significant upgrade to Linac2, which took place during the 1992/1993 shutdown, was the replacement of the 750 kV Cockcroft–Walton generator and of the LEBT with a new RF quadrupole (RFQ) only 1.8 m long, capable of bunching, focusing and accelerating the beam in the same RF structure. The RFQ was a new invention of the early 1980s that was immediately adopted at CERN: after the successful construction of a prototype RFQ for Linac1 (which at the time was still in service), the development of a record-breaking high-intensity RFQ for Linac2, capable of delivering to the DTL a current of 200 mA, started in 1984. The prototype high-current RFQ was commissioned on a test stand in 1989, and the replacement of the Linac2 pre-injector was officially approved in 1990.

Gearing up for the LHC

The main motivation for the higher current of Linac2 was to prepare the CERN injectors for the LHC, whose preparation was already in progress. It was clear that the LHC would require unprecedented beam brightness (intensity per emittance) from the injector chain, and one of the options considered was to go to single-turn injection into the PSB of a high-current linac beam to minimise emittance growth. This, in turn, required the highest achievable current from the linac. Another motivation for the replacement was the simpler operation and maintenance of the smaller RFQ compared with the large Cockcroft–Walton installation.
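Brightness can be read here, up to convention-dependent numerical factors (normalisations differ between laboratories, so this is only an indicative definition offered as an editorial aside), as

\[ B \;\propto\; \frac{I}{\varepsilon_x\,\varepsilon_y}, \]

i.e. beam current divided by the product of the transverse emittances – which is why a higher-current linac, injected in a single turn, was attractive for preserving beam quality for the LHC.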

Construction of the new RFQ (figure 2) started soon after approval, and the new “RFQ2” system was installed at Linac2 during the normal shut-down in 1992/1993. Commissioning of the RFQ2 with Linac2 took a few weeks, and the 1993 physics run started with the new injector. Reaching the full design performance of the RFQ took a few years, mainly due to the slow cleaning of the surfaces that at first limited the peak RF fields possible inside the cavity. After the optics in the long transfer line were modified, the goal of 180 mA delivered to the PSB was achieved in 1998 – and this still ranks as the highest intensity proton beam ever achieved from a linac.

Throughout its life, Linac2 has undergone many upgrades to its subsystems, including major renovations of the control systems in 1993 and 2012, the exchange of more than half its magnet power supplies to more modern units (although a large number were still the same ones installed in the 1970s) and renovation of the RFQ and vacuum-control systems. Nevertheless, at its core, the three DTL RF cavities that are the backbone of the linac remained unchanged since their construction, as were more than 120 electromagnetic quadrupoles sealed in the drift tubes that have each pulsed more than 700 million times without a single magnet failure (figure 3).

Despite the performance and reliability of Linac2, the performance bottleneck of the injection chain for the LHC moved to the injection process of the PSB, which could only be resolved with a higher injection energy. This meant increasing the energy of the linac. At the time this was being considered, around a decade ago, Linac2 was already reaching 30 years of operation, and basing a new injector on it would have required a major consolidation effort. So the decision was made to move to a new accelerator called Linac4 (the name Linac3 is taken by an existing CERN linac that produces ions), which meant a clean slate for its design. Linac4 (figure 4) not only injects into the PSB at the higher energy of 160 MeV, but also switches to negative hydrogen-ion beam acceleration, which allows higher intensities to be accumulated in the PSB after removing the excess electrons.

As was the case when Linac2 took over from Linac1, Linac4 has been built in its own tunnel, allowing construction and commissioning to take place in parallel to the operation of Linac2 for the LHC (CERN Courier January/February 2018 p19). In connecting Linac4 to the PSB, some of the Linac2 transfer line will be dismantled to make space for additional shielding. But the original source, RFQ and three DTL cavities will remain in place for now – even if there is no possibility of their serving as a back-up once the change to Linac4 is made. As for the future of Linac2, hopefully you might one day be able to find part of the accelerator on display somewhere on the CERN site, so that its place in history is not forgotten.

The post The tale of a billion-trillion protons appeared first on CERN Courier.

]]>
Feature Linac2, the machine that feeds CERN’s accelerator complex with protons, has entered a well-deserved retirement after 40 years of service. https://cerncourier.com/wp-content/uploads/2018/11/CCDec18_Linac_frontis.png
Foundations of High-Energy-Density Physics: Physical Processes of Matter at Extreme Conditions https://cerncourier.com/a/foundations-of-high-energy-density-physics-physical-processes-of-matter-at-extreme-conditions/ Mon, 29 Oct 2018 14:07:47 +0000 https://preview-courier.web.cern.ch/?p=101302 This book provides a comprehensive overview of high-energy-density physics (HEDP), which concerns the dynamics of matter at extreme temperatures and densities.

The post Foundations of High-Energy-Density Physics: Physical Processes of Matter at Extreme Conditions appeared first on CERN Courier.

]]>
By Jon Larsen
Cambridge University Press

This book provides a comprehensive overview of high-energy-density physics (HEDP), which concerns the dynamics of matter at extreme temperatures and densities. Such matter is present in stars, active galaxies and planetary interiors, while on Earth it is not found in normal conditions, but only in the explosion of nuclear weapons and in laboratories using high-powered lasers or pulsed-power machines.

After introducing, in the first three chapters, many fundamental physics concepts necessary to the understanding of the rest of the book, the author delves into the subject, covering many key aspects: gas dynamics, ionisation, the equation-of-state description, hydrodynamics, thermal energy transport, radiative transfer and electromagnetic wave–material interactions.

The author is an expert in radiation-hydrodynamics simulations and is known for developing the HYADES code, which is largely used among the HEDP community. This book can be a resource for research scientists and graduate students in physics and astrophysics.

The post Foundations of High-Energy-Density Physics: Physical Processes of Matter at Extreme Conditions appeared first on CERN Courier.

]]>
Review This book provides a comprehensive overview of high-energy-density physics (HEDP), which concerns the dynamics of matter at extreme temperatures and densities. https://cerncourier.com/wp-content/uploads/2022/06/51tspgcLZqL._SX348_BO1204203200_.jpg
Quantized Detector Networks: The Theory of Observation https://cerncourier.com/a/quantized-detector-networks-the-theory-of-observation/ Mon, 29 Oct 2018 14:07:47 +0000 https://preview-courier.web.cern.ch/?p=101303 Quantised Detector Networks (QDN) theory was invented to reduce the level of metaphysics in the application of quantum mechanics (QM), moving the focus from the system under observation to the observer and the measurement apparatuses.

The post Quantized Detector Networks: The Theory of Observation appeared first on CERN Courier.

]]>
By George Jaroszkiewicz
Cambridge University Press

Quantised Detector Networks (QDN) theory was invented to reduce the level of metaphysics in the application of quantum mechanics (QM), moving the focus from the system under observation to the observer and the measurement apparatuses. This approach is based on the consideration that “labstates”, i.e. the states of the system we use for observing, are the only things we can actually deal with, while we have no means to prove that the objects under study “exist” independently of observers or observations.

In this view, QM is not a theory describing objects per se, but a theory of entitlement, which means that it provides physicists with a set of rules defining what an observer is entitled to say in any particular context.

The book is organised in four parts: Basics, Applications, Prospects, and Appendices. The author provides, first of all, the formalism of QDN and then applies it to a number of experiments that show how it differs from standard quantum formalism. In the third part, the prospects for future applications of QDN are discussed, as well as the possibility of constructing a generalised theory of observation. Finally, the appendices collect collateral material referred to at various places in the book.

The aim of the author is to push readers to look in a different way at the world they live in, to show them the cognitive traps caused by realism – i.e. the assumption that what we observe has an existence independent of our observation – and to alert them that various speculative concepts and theories discussed by some scientists do not actually have an empirical basis. In other words, they cannot be experimentally tested.

The post Quantized Detector Networks: The Theory of Observation appeared first on CERN Courier.

]]>
Review Quantised Detector Networks (QDN) theory was invented to reduce the level of metaphysics in the application of quantum mechanics (QM), moving the focus from the system under observation to the observer and the measurement apparatuses. https://cerncourier.com/wp-content/uploads/2022/06/41oDdzI0zfL._SX355_BO1204203200_.jpg
The Great Silence – The Science and Philosophy of Fermi’s Paradox https://cerncourier.com/a/the-great-silence-the-science-and-philosophy-of-fermis-paradox/ Mon, 29 Oct 2018 14:07:46 +0000 https://preview-courier.web.cern.ch/?p=101301 Andrea Giammanco reviews in 2018 The Great Silence - The Science and Philosophy of Fermi's Paradox.

The post The Great Silence – The Science and Philosophy of Fermi’s Paradox appeared first on CERN Courier.

]]>
By Milan Cirkovic
Oxford University Press

Enrico Fermi formulated his eponymous paradox during a casual lunchtime chat with colleagues in Los Alamos: the great physicist argued that, probabilistically, intelligent extraterrestrial lifeforms had time to develop countless times in the Milky Way, and even to travel across our galaxy multiple times; but if so, where are they?

The author of this book, Milan Cirkovic, claims that, with the wealth of scientific knowledge accumulated in the many decades since then, the paradox is now even more severe. Space travel is no longer speculative; we know that planetary systems – including Earth-like planets – are common, that life on our planet started very early, and that our solar system is a relative latecomer on the cosmic scene; hence, we should expect many civilisations to have evolved way beyond our current stage. Given the huge numbers involved, Cirkovic remarks, the paradox would not even be completely solved by the discovery of another civilisation: we would still have to figure out where all the others are!

The Great Silence aims at an exhaustive review of the solutions proposed to this paradox in the literature (where “literature” is to be understood in the broadest sense, ranging from scholarly astrobiology papers to popular-science essays to science-fiction novels), following a rigorous taxonomic approach. Cirkovic’s taxonomy is built from the analysis of which philosophical assumptions create the paradox in the first place. Relaxing the assumptions of realism, Copernicanism, and gradualism leads, respectively, to the families of solutions that Cirkovic labels “solipsist”, “rare Earth”, and “neocatastrophic”. His fourth and most heterogeneous category of solutions, labelled “logistic”, arises from considering possible universal limitations of physical, economic or metabolic nature.

The book starts by setting a rigorous foundation for discussion, summarising the scientific knowledge and dissecting the philosophical assumptions. Cirkovic does not seem interested in captivating the reader from the start: the preface and the first three chapters are definitely scholarly in their intentions, and assume that the reader already knows a great deal about Fermi’s paradox. As a particularly egregious example, Kardashev’s speculative classification of civilisations, based on the scale of their energy consumption, plays a very important role in this book; one would have therefore expected a discussion about that, somewhere at the beginning. Instead, the interested reader has to resort to a footnote for a succinct definition of the three types of civilisation (Type I: exploiting planetary resources; Type II: using stellar system resources; Type III: using galactic resources).

However, after these introductory chapters, Cirkovic’s writing becomes very pleasant and engaging, and his reasoning unfolds clearly. Chapters four to seven are the core of the book, each of them devoted to the solutions allowed by negating one assumption. Every chapter starts with an analogy with a masterpiece in cinema or literature, followed by a rigorous philosophical definition. Then, the consequent solutions to Fermi’s paradox are reviewed and, finally, a résumé of take-home messages is provided.

This parade of solutions gives a strange feeling: each of them sounds either crazy, or incredibly unlikely, or insufficient to solve the paradox (at least in isolation). Still, once we accept Cirkovic’s premise that Fermi’s paradox means that some deeply rooted assumption cannot be valid, we are compelled to take seriously some outlandish hypothesis. The reader is invited to ponder, for example, how the solution to the paradox might depend on the politics of the Milky Way in the last few billion years: extraterrestrial civilisations may have all converged to a Paranoid Style in Galactic Politics, or we might unknowingly be under the jurisdiction of an Introvert Big Brother (Cirkovic has a talent for catchy titles). Some Great Old Ones might be temporarily asleep, or we (and any conceivable biological intelligence) might be limited in our evolution by some Galactic Stomach-Ache. A large class of very gloomy hypotheses assumes that all our predecessors were wiped out before reaching very high Kardashev’s scores, and Cirkovic seems particularly fond of the idea of swarms of Deadly Probes that may still be roaming around, ready to point at us as soon as they notice our loudness. Unless we reach the aforementioned state of galactic paranoia, which makes for a very nice synergy between two distinct solutions of the paradox.

The author not only classifies the proposed solutions, but also rates them by how fully they would solve this paradox. The concluding chapter elaborates on several philosophical challenges posed by Fermi’s paradox, in particular to Copernicanism, and on the link between it and the future of humanity.

Cirkovic is a vocal (and almost aggressive) critic of most of the SETI-related literature, claiming that it relies on excessive assumptions that strongly limit SETI searches. In his words, the failure of SETI so far has mostly occurred on philosophical and methodological levels. He quotes Kardashev in saying that extraterrestrial civilisations have not been found because they have not really been searched for. Hence Cirkovic’s insistence on a generalisation of targets and search methods.

An underlying theme in this book is the relevance of philosophy for the advancement of science, in particular when a science is in its infancy, as he argues to be the case for astrobiology. Cirkovic draws an analogy with early 20th century cosmology, including a similarity between Fermi’s and Olbers’ paradoxes (the latter being: how can the night sky be dark, if we are reachable by the light of an infinite number of stars in an infinitely old universe?).
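The force of Olbers’ paradox is easiest to see in a one-line estimate (an editorial aside, not taken from Cirkovic’s text): in a static, infinite universe uniformly filled with stars of number density n and luminosity L, the flux received from a spherical shell of radius r and thickness dr is independent of r, so the total diverges,

\[ \mathrm dF \;=\; n\,\frac{L}{4\pi r^{2}}\,4\pi r^{2}\,\mathrm dr \;=\; nL\,\mathrm dr \quad\Longrightarrow\quad F \;=\; \int_{0}^{\infty} nL\,\mathrm dr \;\to\; \infty, \]

and the night sky should blaze – unless one of the assumptions fails.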

I warmly recommend The Great Silence to any curious reader, even though it makes few concessions to a broad readership. In it, Cirkovic makes a convincing case that Fermi’s paradox is a fabulously complex and rich intellectual problem.

The post The Great Silence – The Science and Philosophy of Fermi’s Paradox appeared first on CERN Courier.

]]>
Review Andrea Giammanco reviews in 2018 The Great Silence - The Science and Philosophy of Fermi's Paradox. https://cerncourier.com/wp-content/uploads/2022/06/41357fxfItL.jpg
Strange Glow: The Story of Radiation https://cerncourier.com/a/strange-glow-the-story-of-radiation/ Mon, 29 Oct 2018 14:07:39 +0000 https://preview-courier.web.cern.ch/?p=101300 In this book, Timothy Jorgensen, a professor of radiation medicine at Georgetown University in the US, recounts the story of the discovery of radioactivity and how mankind has been transformed by it, with the aim of sweeping away some of the mystery and misunderstanding that surrounds radiation.

The post Strange Glow: The Story of Radiation appeared first on CERN Courier.

]]>
By Timothy J Jorgensen
Princeton University Press

Strange Glow

In this book, Timothy Jorgensen, a professor of radiation medicine at Georgetown University in the US, recounts the story of the discovery of radioactivity and how mankind has been transformed by it, with the aim of sweeping away some of the mystery and misunderstanding that surrounds radiation.

The book is structured in three parts. The first is devoted to the discovery of ionising radiation in the late 19th century and its rapid application, notably in the field of medical imaging. The author establishes a vivid parallel with the discovery and exploitation of radio waves, the non-ionising counterpart of higher-energy X-rays. A dynamic narrative, peppered with personal anecdotes by key actors, succeeds in transmitting the decisive scientific and societal impact of radiation and related discoveries. The interleaving of the history of the discovery with aspects of the lives of inspirational figures such as Ernest Rutherford and Enrico Fermi is certainly very relevant, attractive and illustrative.
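The distinction between the ionising and non-ionising branches of that parallel comes down to photon energy; as a rough illustration (an editorial aside, not a passage from the book),

\[ E \;=\; \frac{hc}{\lambda} \;\approx\; \frac{1240\ \mathrm{eV\,nm}}{\lambda} \]

gives roughly 12 keV for a 0.1 nm X-ray but only about a millionth of an electronvolt for a metre-long radio wave – far below the few electronvolts needed to ionise an atom.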

In the second part, the author focuses on the impact of ionising radiation on human health, mostly through occupational exposure in different working sectors. A strong focus is on the case of the “radium girls” – female factory workers who were poisoned by radiation from painting watch dials with self-luminous paint. This section also depicts the progress in radiation-protection techniques and the challenges related to quantifying the effects of radiation and establishing limits for the exposure to it. The text succeeds in outlining the difficulties of linking physical quantities of radiation with its impact on human health.

The risk assessment related to radiation exposure and its impact on human health is further covered in the third part of the book. Here, Jorgensen aims to provide quantitative tools for the public to be able to evaluate the benefits and risks associated with radiation exposure. Despite his effort to offer a combination of complementary statistical approaches, readers are left with an impression that many aspects of the impact of radiation on human health are not fully understood. By contrast, the large number of radiation-exposure cases in the Hiroshima and Nagasaki nuclear bombings, after which it was possible to correlate the absorbed dose with the location of the various victims at the time of the explosion, provides a scientifically valuable sample to study both deterministic and stochastic effects of radiation on human health.

In part three, the book also digresses at length about the role of nuclear weapons in the US defence and geopolitical strategy. This topic seems somewhat misplaced with respect to the more technical and scientific content of the rest of the text. Moreover, it is highly US-centric, often neglecting the analogous role of such weapons in other countries.

It is noteworthy that the book does not cover radiation in space and its crucial impact on human spaceflight. Likewise, the discovery of cosmic radiation through Hess’ balloon experiment in 1911–1912, while constituting an essential finding in addition to the already discovered radioactivity from elements on the Earth’s surface, is completely overlooked.

Despite the lack of space-radiation coverage and the somewhat unrelated US defence considerations, this book is definitely a very good read that will satisfy the reader’s curiosity and interest with respect to radiation and its impact on humans. In addition, it provides insight into the more general progress of physics, especially in the first half of the 20th century, in a highly dynamic and entertaining manner.

The post Strange Glow: The Story of Radiation appeared first on CERN Courier.

]]>
Review In this book, Timothy Jorgensen, a professor of radiation medicine at Georgetown University in the US, recounts the story of the discovery of radioactivity and how mankind has been transformed by it, with the aim of sweeping away some of the mystery and misunderstanding that surrounds radiation. https://cerncourier.com/wp-content/uploads/2018/10/CCNov18_Book-jorgensen.jpg
Deep physics brought to life through photography https://cerncourier.com/a/deep-physics-brought-to-life-through-photography/ Mon, 29 Oct 2018 09:00:26 +0000 https://preview-courier.web.cern.ch/?p=12864 the wining entries of the 2018 Global Physics Photowall competition

The post Deep physics brought to life through photography appeared first on CERN Courier.

]]>

The 2018 Global Physics Photowalk brought hundreds of amateur and professional photographers to 18 laboratories around the world, including CERN, to capture their scientific facilities and workforce. The science of the participating labs ranges from exploring the origins of the cosmos to understanding our planet’s climate, and from improving human and animal health to helping deliver secure and sustainable food and energy supplies for the future.

Following local competitions, each lab submitted its top three images to the global competition. A public online vote chose the top three from those images, and a jury of expert photographers and scientists also picked their three favourites. The photowalk was organised by the Interactions collaboration, and was supported by the Royal Photographic Society and Association of Science-Technology Centers (ASTC). The winning entries, shown here, were announced on 30 September at the ASTC annual conference in Hartford, Connecticut.

Simon Wright bagged first place in the expert jury’s choice with this shot taken at the UK’s STFC Boulby Underground Laboratory, which is located 1.1 km underground in Europe’s deepest operating mine and contributes to the search for dark matter. The photograph captures STFC’s Tamara Leitan as she scanned an information board at the lab. To highlight Leitan’s face, Wright used a miner’s lamp instead of a flash to minimise interference with light reflected from the safety equipment that workers must wear at the mine.

Simon Wright received another award, this time third prize in the people’s choice category, for this image of green fluorescent lighting at an underground tunnel at the UK’s STFC Chilbolton Observatory, which is home to a wide range of science facilities.

Jon McRae took third place in the expert jury’s selection, as well as second place in the people’s choice, for this photo of the DESCANT neutron detector at Canada’s TRIUMF laboratory. The detector can be mounted on the TIGRESS and GRIFFIN experiments to study nuclear structure. Holding a small, spherical lens between the camera and the detector array, McRae recreated a miniature simulacrum of DESCANT in the crystal-clear glass ball.

Stefano Ruzzini won the expert jury’s second prize for this photograph of a silicon-strip particle detector, which was first used in CERN’s NA50 experiment but is now at Italy’s INFN Frascati National Laboratories. The photo was praised by the judges for portraying the three-dimensional aspect of the detector.

This picture from Gianluca Micheletti was also awarded third place in the expert jury’s selection. It shows a researcher observing the XENON1T dark-matter experiment at Italy’s INFN Gran Sasso National Laboratories. The judges commended Micheletti’s composition of the image in evoking the sense of curiosity at the heart of physics.

Luca Riccioni snapped a picture of the KLOE-2 experiment at Italy’s INFN Frascati National Laboratories, which recently concluded its data-taking campaign at the DAΦNE electron–positron collider. The photograph was awarded first place in the people’s choice category.

The post Deep physics brought to life through photography appeared first on CERN Courier.

]]>
Feature The winning entries of the 2018 Global Physics Photowalk competition https://cerncourier.com/wp-content/uploads/2018/10/CCNov18_PHOTO-Wright_Boulby.jpg
Nanoelectronics: Materials, Devices, Applications (2 volumes) https://cerncourier.com/a/nanoelectronics-materials-devices-applications-2-volumes/ Fri, 28 Sep 2018 14:37:28 +0000 https://preview-courier.web.cern.ch/?p=101327 This book aims to provide an overview of both present and emerging nanoelectronics devices, focusing on their numerous applications such as memories, logic circuits, power devices and sensors.

The post Nanoelectronics: Materials, Devices, Applications (2 volumes) appeared first on CERN Courier.

]]>
By R Puers, L Baldi, M Van de Voorde and S E van Nooten (editors)
Wiley–VCH

Nanoelectronics: Materials, Devices, Applications

This book aims to provide an overview of both present and emerging nanoelectronics devices, focusing on their numerous applications such as memories, logic circuits, power devices and sensors. It is one unit (in two volumes) of a complete series of books that are dedicated to nanoscience and nanotechnology, and their penetration in many different fields, ranging from human health, agriculture and food science, to energy production, environmental protection and metrology.

After an introduction about the semiconductor industry and its development, different kinds of devices are discussed. Specific chapters are also dedicated to new materials, device-characterisation techniques, smart manufacturing and advanced circuit design. Then, the many applications are covered, which also shows the emerging trends and economic factors influencing the progress of the nanoelectronics industry.

Since nanoelectronics is nowadays fundamental for any science and technology that requires communication and information processing, this book can be of interest to electronic engineers and applied physicists working with sensors and data-processing systems.

The post Nanoelectronics: Materials, Devices, Applications (2 volumes) appeared first on CERN Courier.

]]>
Review This book aims to provide an overview of both present and emerging nanoelectronics devices, focusing on their numerous applications such as memories, logic circuits, power devices and sensors. https://cerncourier.com/wp-content/uploads/2018/10/CCOct18Book-Voorde.jpg
Picturing Quantum Processes: A First Course in Quantum Theory and Diagrammatic Reasoning https://cerncourier.com/a/picturing-quantum-processes-a-first-course-in-quantum-theory-and-diagrammatic-reasoning/ Fri, 28 Sep 2018 14:37:13 +0000 https://preview-courier.web.cern.ch/?p=101326 This book is about telling the story of quantum theory entirely in terms of pictures.

The post Picturing Quantum Processes: A First Course in Quantum Theory and Diagrammatic Reasoning appeared first on CERN Courier.

]]>
By Bob Coecke and Aleks Kissinger
Cambridge University Press

Picturing Quantum Processes

“This book is about telling the story of quantum theory entirely in terms of pictures,” declare the authors of this unusual book, in which quantum processes are explained using diagrams and an innovative method for presenting complex theories is set up. The book employs a unique formalism developed by the authors, which allows a more intuitive understanding of quantum features and eliminates complex calculations. As a result, knowledge of advanced mathematics is not required.

The entirely diagrammatic presentation of quantum theory proposed in this (bulky) volume is the result of 10 years of work and research carried out by the authors and their collaborators, uniting classical techniques in linear algebra and Hilbert spaces with cutting-edge developments in quantum computation and foundational QM.

An informal and entertaining style is adopted, which makes this book easily approachable by students at their first encounter with quantum theory. That said, it will probably appeal more to PhD students and researchers who are already familiar with the subject and are interested in looking at a different treatment of this matter. The text is also accompanied by a rich set of exercises.

The post Picturing Quantum Processes: A First Course in Quantum Theory and Diagrammatic Reasoning appeared first on CERN Courier.

]]>
Review This book is about telling the story of quantum theory entirely in terms of pictures. https://cerncourier.com/wp-content/uploads/2018/10/CCOct18Book-Coecke.jpg
Essential Quantum Mechanics for Electrical Engineers https://cerncourier.com/a/essential-quantum-mechanics-for-electrical-engineers/ Fri, 28 Sep 2018 14:36:17 +0000 https://preview-courier.web.cern.ch/?p=101323 The aim of the author was to provide a concise book in which both the basic concepts of QM and its most relevant applications to electronics and information technologies are covered making use of only the very essential mathematics.

The post Essential Quantum Mechanics for Electrical Engineers appeared first on CERN Courier.

]]>
By Peter Deák
Wiley–VCH

Essential Quantum Mechanics for Electrical Engineers

The most recent and upcoming developments of electronic devices for information technology are increasingly being based on physical phenomena that cannot be understood without some knowledge of quantum mechanics (QM). In the new hardware, switching happens at the level of single electrons and tunnelling effects are frequently used; in addition, the superposition of electron states is the foundation of quantum information processing. As a consequence, the study of QM, as well as informatics, is now being introduced in undergraduate electrical and electronic engineering courses. However, there is still a lack of textbooks on this subject written specifically for such courses.

The aim of the author was to fill this gap and provide a concise book in which both the basic concepts of QM and its most relevant applications to electronics and information technologies are covered, making use of only the very essential mathematics.

The book starts off with classical electromagnetism and shows its limitations when it comes to describing the phenomena involved in modern electronics. More advanced concepts are then gradually introduced, from wave–particle duality to the mathematical construction used to describe the state of a particle and to predict its properties. The quantum well and tunnelling through a potential barrier are explained, followed by a few applications, including light-emitting diodes, infrared detectors, quantum cascade lasers, Zener diodes, flash memories and the scanning tunnelling microscope. Finally, the author discusses some of the consequences of QM for the chemical properties of atoms and other many-electron systems, such as semiconductors, as well as the potential hardware for quantum information processing.
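For orientation (a textbook result recalled as an editorial aside, not a passage from Deák’s book), the probability of tunnelling through a rectangular barrier of height V₀ and width d is, in the thick-barrier limit, approximately

\[ T \;\approx\; e^{-2\kappa d}, \qquad \kappa \;=\; \frac{\sqrt{2m\,(V_{0}-E)}}{\hbar}, \]

and it is this exponential sensitivity to d and V₀ that devices such as flash memories, Zener diodes and the scanning tunnelling microscope exploit.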

Even though the mathematical formulation of basic concepts is introduced when required, the author’s approach is oriented towards limiting calculations and abstraction in favour of practical applications. Applets, accessible on the internet, are also used as a support to ease the computational work and quickly visualise the results.

The post Essential Quantum Mechanics for Electrical Engineers appeared first on CERN Courier.

]]>
Review The aim of the author was to provide a concise book in which both the basic concepts of QM and its most relevant applications to electronics and information technologies are covered making use of only the very essential mathematics. https://cerncourier.com/wp-content/uploads/2018/10/CCOct18Book-Deak.jpg
Third Thoughts https://cerncourier.com/a/third-thoughts/ Fri, 28 Sep 2018 14:36:16 +0000 https://preview-courier.web.cern.ch/?p=101322 Letizia Diamante reviews in 2018 Third Thoughts.

The post Third Thoughts appeared first on CERN Courier.

]]>
By Steven Weinberg
The Belknap Press of Harvard University Press

Third Thoughts

When Nobel laureates offer their point of view, people generally are curious to listen. Self-described rationalist, realist, reductionist and devoutly secular, Steven Weinberg has published a new book reflecting on current affairs in science and beyond. In Third Thoughts, he addresses themes that are of interest for both laypeople and researchers, such as the public funding of science.

Weinberg shared the Nobel Prize in Physics in 1979 for unifying the weak interaction and electromagnetism into the electroweak theory, the core of the Standard Model, and has made many other significant contributions to physics. At the same time, Weinberg has been and remains a keen science populariser. Probably his most famous work is the popular-science book The First Three Minutes, where he recounts the evolution of the universe immediately following the Big Bang.

Third Thoughts is his third collection of essays for non-specialist readers, following Lake Views (2009) and Facing Up (2001). In it are 25 essays divided into four themes: science history, physics and cosmology, public matters, and personal matters. Some are the texts of speeches, some were published previously in The New York Review of Books, and others are released for the first time.

The essays span subjects from quantum mechanics to climate change, from broken symmetry to cemeteries in Texas, and are pleasantly interspersed with his personal life stories. As in his previous collections, Weinberg deals with topics that are dear to him: the history of science, science spending, and the big questions about the future of science and humanity.

The author defines himself as an enthusiastic amateur in the history of science, albeit a “Whig interpreter” (meaning that he evaluates past scientific discoveries by comparing them to the current advancements – a method that irks some historians). Beyond that, his taste for controversy encourages him to cogitate over Einstein’s lapses, Hawking’s views, the weaknesses of quantum mechanics and the US government’s financing choices, among others.

Readers who are interested in US politics will find the section “Public matters” very thought-provoking. In particular, the essay “The crisis of big science” is based on a talk he gave at the World Science Festival in 2011 and later published in the New York Review of Books. He explains the need for big scientific projects, and describes how both cosmology and particle physics are struggling for governmental support. Though still disappointed by the cut of the Superconducting Super Collider (SSC) in the early 1990s, he is excited by the new endeavours at CERN. He reiterates his frank opinions against manned space flight, and emphasises how some scientific obstacles are intertwined in the historical panorama. In this way, Weinberg sets the cancellation of the SSC in a wider problematic context, where education, healthcare, transportation and law enforcement are under threat.

The author condenses the essence of what physicists have learnt so far about the laws of nature and why science is important. This is a book about asking the right questions when the time is ripe to look for the answers. He explains that the question “What is the world made of?” needed to wait for advances in chemistry at the end of the 18th century, and “What is the structure of the electron?” needed to wait for quantum mechanics, while “What is an elementary particle?” is still waiting for an answer.

The essays vary in difficulty, and some concepts and views are repeated in several essays, thus each of them can be read independently. While most are digestible for readers without any background knowledge in particle physics, a general understanding of the Standard Model would help with grasping the content of some of the paragraphs. Having said that, the general reader can still follow the big picture and logically-argued thoughts.

Several essays talk about CERN. More specifically, “The Higgs, and beyond” was written in 2011, before the announcement of the Higgs boson discovery, and briefly presents the possibility of technicolour forces. The following essay, “Why the Higgs?”, was commissioned just after the announcement in 2012 to explain “what all the fuss is about”.

One of the most curious essays to explore is number 24. Citing Weinberg: “Essay 24 has not been published until now because everyone who read it disagreed with it, but I am fond of it so bring it out here.” There, he draws parallels between his job as a theoretical physicist and that of creative artists.

Not all scientists are able to write in such an unconstrained and accessible way. Despair, sorrow, frustration, doubt, uneasiness and wishes all emerge page after page, offering the reader the privilege of coming closer to one of the sharpest scientific minds of our era.

The post Third Thoughts appeared first on CERN Courier.

]]>
Review Letizia Diamante reviews in 2018 Third Thoughts. https://cerncourier.com/wp-content/uploads/2018/10/CCOct18Book-Weinberg.jpg
From Stars to States: A Manifest for Science in Society https://cerncourier.com/a/from-stars-to-states-a-manifest-for-science-in-society/ Fri, 28 Sep 2018 14:36:15 +0000 https://preview-courier.web.cern.ch/?p=101321 James Gilles reviews in 2018 From Stars To States: A Manifest for Science in Society.

The post From Stars to States: A Manifest for Science in Society appeared first on CERN Courier.

]]>
By Thierry Courvoisier
Springer

From Stars to States

This book is a curiosity, but like many curiosities, well worth stumbling across. It is the product of a curious, roving mind with a long and illustrious career dedicated to the exploration of nature and the betterment of society. Pieced together with cool scientific logic, it takes the reader from a whistle-stop tour of modern astronomy through the poetry collection of Jocelyn Bell-Burnell, to a science-inspired manifesto for the future of our planet. After an opening chapter tracing the development of astronomy from the 1950s to now, subsequent chapters show how gazing at the stars, and learning from doing so, has brought benefit to people from antiquity to modern times across a wide range of disciplines.

Astronomy helped our ancestors to master time, plant crops at the right moment, and navigate their way across wide oceans. There’s humour in the form of speculation about the powers of persuasion of those who convinced the authorities of the day to build the great stone circles that dot the ancient world, allowing people to take time down from the heavens. These were perhaps the Large Hadron Colliders of their time, and, in Courvoisier’s view, probably took up a considerably larger fraction of ancient GDP (gross domestic product) than modern scientific instruments. John Harrison’s remarkable clocks are given pride of place in the author’s discussion of time, though the perhaps even more remarkable Antikythera mechanism is strangely absent.

By the time we reach chapter three, the beginnings of a virtuous circle linking basic science to technology and society are beginning to appear, and we can start to guess where Courvoisier is taking us. The author is not only an emeritus professor of astronomy at the University of Geneva, but also a former president of the Swiss Academy of Sciences and current president of EASAC, the European Academies Science Advisory Council. For good measure, he is also president of the H Dudley Wright Foundation, a charitable organisation that supports science communication activities, mainly in French-speaking Switzerland. He is, in short, a living, breathing link between science and society.

In chapter four, we enjoy the cultural benefits of science and the pleasure of knowledge for its own sake. We have a glimpse of what in Swiss German is delightfully referred to as Aha Erlebnis – that eureka moment when ideas just fall into place. It reminded me of the passage in another curious book, Kary Mullis's Dancing Naked in the Mind Field, in which Mullis describes the Aha Erlebnis that led to his receiving the Nobel Prize in Chemistry in 1993. It apparently came to him so strongly out of the blue on a night drive along a California freeway that he had to pull off the road and write it down. Edison's famous 1% inspiration may be rare, but what a wonderful thing it is when it happens.

Chapter five begins the call to action for scientists to take up the role that their field demands of them in society. "We still need to generate the culture required to […] bring existing knowledge to places where it can and must contribute to actions fashioning the world." Courvoisier examines the gulf between the rational world of science and the rather different world of policy – a gulf once memorably described by Lew Kowarski in his account of the alliance between scientists and diplomats that led to the creation of CERN. "It was a pleasure to watch the diplomats grapple with the difference between a cyclotron and a plutonium atom," he said. "We had to compensate by learning how to tell a subcommittee from a working party, and how – in the heat of a discussion – to address people by their titles rather than their names. Each side began to understand the other's problems and techniques; a mutual respect grew in place of the traditional mistrust between egg-headed pedants and pettifogging hair-splitters." CERN stands as evidence of the good that comes when science and policy work together.

As we reach the business end of the book, we find a rallying call for strengthening our global institutions, and here another of Courvoisier’s influences comes to the fore. He’s Swiss, and a scientist. Scientists have long understood the benefits of collaboration, and if there is one country in the world that has managed to reconcile the nationalism of its regions with the greater need of the supra-cantonal entity of the country as a whole, it is Switzerland. It would be a gross oversimplification to say that Courvoisier’s manifesto is to apply the Swiss model to global governance, but you get the idea.

The book was originally published in French by the Geneva publisher Georg, and if there's one criticism I have, it's the translation. It makes Catherine Bréchignac, who writes fluidly in French, come across as rather clunky in her introduction, and on more than one occasion I found myself wondering whether the words I was reading really expressed what the author wanted to say. Springer and the Swiss Academy of Sciences are to be lauded for bringing this manifesto to an Anglophone audience, but for those who read French, I'd recommend the original.

The post From Stars to States: A Manifest for Science in Society appeared first on CERN Courier.

]]>
Review James Gillies reviews in 2018 From Stars To States: A Manifest for Science in Society. https://cerncourier.com/wp-content/uploads/2018/10/CCOct18Book-Courvoisier.jpg
Preserving European unity in physics https://cerncourier.com/a/viewpoint-preserving-european-unity-in-physics/ Wed, 26 Sep 2018 08:30:17 +0000 https://preview-courier.web.cern.ch/?p=12740 As the EPS turns 50, building scientific bridges across political divides remains as vital as ever, argues Rüdiger Voss.

The post Preserving European unity in physics appeared first on CERN Courier.

]]>
The European Physical Society

The year 1968 marked a turning point in the history of post-war Europe that remains engraved in our collective memory. Global politics was dominated by massive student unrest, the Cold War and East–West confrontation. On 21 August the Soviet Union and other Warsaw Pact states invaded Czechoslovakia to crush the movement of liberalisation, democratisation and civil rights, which had become known as the Prague Spring.

Against this background, it seems a miracle that the European Physical Society (EPS) was established only a few weeks later, on 26 September, with representatives of the Czechoslovak Physical Society and the USSR Academy of Sciences sitting at the same table. The EPS was probably the first learned society in Europe involving physicists from both sides of the Iron Curtain. Ever since, building scientific bridges across political divides has been core to the society’s mission.

It was no accident that the EPS was founded in Geneva. While CERN played no formal role, its model of European cooperation had a substantial impact on the genesis of the new society. CERN was at that time principally an organisation of Western European states, but it had started early to develop scientific collaboration with the Soviet Union and other Eastern countries, notably through the Joint Institute for Nuclear Research in Dubna. Leading CERN physicists – including Director-General Bernard Gregory – were instrumental in setting up the new society; Gilberto Bernardini, who had been CERN's first director of research in 1960–1961 and was a strong advocate of international collaboration in science, became the first EPS president. From the 20 national physical societies and similar organisations that took part in the 1968 foundation, membership has now grown to 42 such societies, covering almost all of Europe plus Israel and representing more than 130,000 members. In addition, there are about 42 associate members – mostly major research institutions, including CERN – and, last but not least, around 3500 individual members.

Today, the EPS serves the European physics community in a twofold way: by promoting collaboration across borders and disciplines, through activities such as conferences, publications and prizes; and by reaching out to political decision makers, media and the public to promote awareness of the importance of physics education and research.

Rüdiger Voss

The Iron Curtain is history, but the EPS celebrates its 50th anniversary at a time when new, more complex and subtle political divides are opening up in Europe: the UK's departure from the European Union (EU) is only the most prominent example. While the results of democratic votes must be respected, a continued erosion of European unity will undermine fundamental values and best practices that many of us take for granted: free cross-border collaboration, unrestricted mobility of researchers and students, and access to European funding and infrastructures. For almost 30 years now, in Europe, we have taken such freedoms in science as self-evident. Today, prestigious universities in the heart of Europe are threatened with closure on political grounds, while in other countries physicists are jailed for claiming the right to freely exercise their academic profession. These concerns are not unique to physics and must be addressed by the scientific community at large. The EPS, representing a science with a long tradition and highly developed culture of international collaboration, has a special responsibility to uphold these values.

Against this challenging background, the EPS is undertaking efforts to make the voice of the physics community more clearly heard in European science-policy making, principally through a point of presence in Brussels to facilitate communication with the European Commission and with partner organisations defending similar interests. In an environment where funding opportunities are increasingly organised around societal rather than scientific challenges, the EPS must advocate a healthy and sustained balance between basic and applied research. The next European Framework Programme, Horizon Europe, must not only provide fair access to funds, research opportunities and infrastructure for researchers from EU countries, but should remain equally open to participation from third countries, following the example of the successful association of countries like Norway and Switzerland with Horizon 2020. Building scientific bridges across political divides remains as vital as ever, in the best interest of a strong cohesion of the European physics community.

The post Preserving European unity in physics appeared first on CERN Courier.

]]>
Opinion As the EPS turns 50, building scientific bridges across political divides remains as vital as ever, argues Rüdiger Voss. https://cerncourier.com/wp-content/uploads/2018/09/CCOct18View-epsCROP.jpg
Classical Field Theory https://cerncourier.com/a/classical-field-theory/ Fri, 31 Aug 2018 14:48:29 +0000 https://preview-courier.web.cern.ch/?p=101339 This book provides a comprehensive introduction to classical field theory, which concerns the generation and interaction of fields and is the logical precursor of quantum field theory.

The post Classical Field Theory appeared first on CERN Courier.

]]>
By Joel Franklin
Cambridge University Press

This book provides a comprehensive introduction to classical field theory, which concerns the generation and interaction of fields and is the logical precursor of quantum field theory. But, while in most university physics programmes students are taught classical mechanics first and then quantum mechanics, quantum field theory is normally not preceded by dedicated classical field theory courses. The author, though, argues that it would be worth giving more room to classical field theory, since it offers a good way to think about modern physical model building.

The focus is on the relativistic structural elements of field theories, which enable a deeper understanding of Maxwell's equations and of electromagnetic field theory. The same holds for other areas of physics, such as gravity.

The book comprises four chapters and is completed by three appendices. The first chapter provides a review of special relativity, with some in-depth discussion of transformations and invariants. Chapter two focuses on Green's functions and their role as integral building blocks, offering as examples static problems in electricity and the full wave equation of electromagnetism. In chapter three, Lagrangian mechanics is introduced, together with the notions of a field Lagrangian and of action. The last chapter is dedicated to gravity, another classical field theory. The appendices include mathematical and numerical methods useful for field theories and a short essay on how one can take a compact action and develop from it all the physics known from electromagnetism (EM).

Written for advanced-undergraduate and graduate students, this book is meant for dedicated courses on classical field theory, but could also be used in combination with other texts for advanced classes on EM or a course on quantum field theory. It could also be used as a reference text for self-study.

The post Classical Field Theory appeared first on CERN Courier.

]]>
Review This book provides a comprehensive introduction to classical field theory, which concerns the generation and interaction of fields and is the logical precursor of quantum field theory. https://cerncourier.com/wp-content/uploads/2018/08/CCSep18Book-classical.jpg
From Photon to Neuron: Light, Imaging, Vision https://cerncourier.com/a/from-photon-to-neuron-light-imaging-vision/ Fri, 31 Aug 2018 14:48:28 +0000 https://preview-courier.web.cern.ch/?p=101336 Luis Álvarez-Gaumé reviews in 2018 From Photon To Neuron: Light, Imaging, Vision.

The post From Photon to Neuron: Light, Imaging, Vision appeared first on CERN Courier.

]]>
By Philip Nelson
Princeton University Press 2017

This book is as elegant as it is deep: a masterful tour of the science of light and vision. It goes beyond artificial boundaries between disciplines and presents all aspects of light as it appears in physics, chemistry, biology and the neural sciences.

The text is addressed to undergraduate students – an added challenge for the author, which he meets brilliantly. Since many of the biological phenomena involved in our perception of light (in photosynthesis, image formation and image interpretation) ultimately happen at the molecular level, one is introduced rather early to the quantum treatment of the particles that form light: photons. Once this is complemented with the wave–particle duality characteristic of quantum mechanics, it becomes much easier to understand a large palette of natural phenomena without relying on the classical theory of light, embodied by Maxwell's equations, whose mathematical structure is far more advanced than what is required. The classical approach has the problem that one eventually needs the quantisation of the electromagnetic field to bring photons into the picture, which would make the text rather unwieldy, and not accessible to a majority of undergraduates or biologists working in the field.

In the same way that the author instructs non-physics students in some basic physics concepts and tools, he also provides physicists with accessible and very clear presentations of many biological phenomena involving light. This is a textbook, not an encyclopaedia, hence a selection of such phenomena is necessary to illustrate the concepts and methods needed to develop the material. There are sections at the end of most chapters containing more advanced topics, and also suggestions for further reading to gain additional insight, or to follow some of the threads left open in the main text of the chapter.

A cursory perusal of the table of contents at the beginning will give the reader an idea of the breadth and depth of material covered. There is a very accessible presentation of the theory of colour, from a physical and biological point of view, and its psychophysical effects. The evolution of the eye and of vision at different stages of animal complexity, imaging, the mechanism of visual transduction and many more topics are elegantly covered in this remarkable book.

The final chapters contain some advanced topics in physics, namely, the treatment of light in the theory of quantum electrodynamics. This is our bread and butter in particle physics, but the presentation is more demanding on the reader than any of the previous chapters.

Unlike chapter zero, which explains the rudiments of probability theory in the standard frequentist and Bayesian approaches that can be understood basically by anyone familiar with high-school mathematics, chapters 12 and 13 require a more substantial background in advanced physics and mathematics.

The gestalt approach advocated by this book makes it one of the most insightful, cross-disciplinary texts I have read in many years. It is mesmerising and highly recommended, and will become a landmark in rigorous, yet highly accessible, interdisciplinary literature.

The post From Photon to Neuron: Light, Imaging, Vision appeared first on CERN Courier.

]]>
Review Luis Álvarez-Gaumé reviews in 2018 From Photon To Neuron: Light, Imaging, Vision. https://cerncourier.com/wp-content/uploads/2018/08/CCSep18Book-Photon.jpg
Applied Computational Physics https://cerncourier.com/a/applied-computational-physics/ Fri, 31 Aug 2018 14:48:28 +0000 https://preview-courier.web.cern.ch/?p=101337 This book aims to provide physical sciences students with the computational skills that they will need in their careers and expose them to applications of programming to problems relevant to their field of study.

The post Applied Computational Physics appeared first on CERN Courier.

]]>
By Joseph Boudreau and Eric Swanson
Oxford University Press

This book aims to provide physical sciences students with the computational skills that they will need in their careers and expose them to applications of programming to problems relevant to their field of study. The authors, who are professors of physics at the University of Pittsburgh, decided to write this text to fill a gap in the current scientific literature that they noticed while teaching and training young researchers. Often, graduate students have only a basic knowledge of coding, so they have to learn on the fly when asked to solve "real-world" problems, like those involved in physics research. Since this way of learning is not optimal and sometimes slow, the authors propose this guide for more structured study.

Over almost 900 pages, this book introduces readers to modern computational environments, starting from the foundations of object-oriented computing. Parallel-computation concepts, protocols and methods are also discussed early in the text, as they are considered essential tools.

The book covers various important topics, including Monte Carlo methods, simulations, graphics for physicists and data modelling, and devotes ample space to algorithmic techniques. Many chapters are also dedicated to specific physics applications, such as Hamiltonian systems, chaotic systems, percolation, critical phenomena, few-body and multi-body quantum systems, and quantum field theory. Nearly 400 exercises of varying difficulty complete the text.

Even though most of the examples come from experimental and theoretical physics, this book could also be very useful for students in chemistry, biology, atmospheric science and engineering. Since the numerical methods and applications are sometimes technical, it is particularly appropriate for graduate students.

The post Applied Computational Physics appeared first on CERN Courier.

]]>
Review This book aims to provide physical sciences students with the computational skills that they will need in their careers and expose them to applications of programming to problems relevant to their field of study. https://cerncourier.com/wp-content/uploads/2018/08/CCSep18Book-applied.jpg
Quantum Field Theory Approach to Condensed Matter Physics https://cerncourier.com/a/quantum-field-theory-approach-to-condensed-matter-physics/ Fri, 31 Aug 2018 14:48:28 +0000 https://preview-courier.web.cern.ch/?p=101338 This book provides an excellent overview of the state of the art of quantum field theory (QFT) applications to condensed-matter physics (CMP).

The post Quantum Field Theory Approach to Condensed Matter Physics appeared first on CERN Courier.

]]>
By Eduardo C Marino
Cambridge University Press

This book provides an excellent overview of the state of the art of quantum field theory (QFT) applications to condensed-matter physics (CMP). Nevertheless, it is probably not the best choice for a first approach to this wonderful discipline.

QFT is used to describe particles in the relativistic (high-energy) regime but, as is well known, its methods can also be applied to problems involving many interacting particles – typically electrons. The conventional way of studying solid-state physics and, in particular, silicon devices does not make use of QFT methods, owing to the success of models in which independent electrons move in a crystalline substrate. Today, though, we deal with various condensed-matter systems that lie beyond the reach of that simple model and could instead profit from QFT tools. Among them: superconductivity beyond the Bardeen–Cooper–Schrieffer approach (high-temperature superconducting cuprates and iron-based superconductors), the quantum Hall effect, conducting polymers, graphene and silicene.

The author, as he himself states, aims to offer a unified picture of condensed-matter theory and QFT. Thus, he highlights the interplay between these two theories in many examples to show how similar mechanisms operate in different systems, despite being separated by several orders of magnitude in energy. He compares, for example, the Landau–Ginzburg field of a superconductor with the Anderson–Higgs field in the Standard Model. He also explains the not-so-well-known relation between the Yukawa mechanism for mass generation of leptons and quarks and the Peierls mechanism of gap generation in polyacetylene: the same trilinear interaction between a Dirac field, its conjugate and a scalar field that explains why polyacetylene is an insulator is responsible for the mass of elementary particles.

The book is structured into three parts. The first covers conventional CMP (at advanced undergraduate level). The second provides a brief review of QFT, with emphasis on the mathematical analysis and methods appropriate for non-trivial many-body systems (in particular in chapters eight and nine, where classical and quantum descriptions of topological excitations are given). I found remarkable the pages devoted to renormalisation, in which the author clearly explains that the renormalisation procedure is made necessary by the presence of interactions in any QFT, not by the appearance of divergences in a perturbative approach. The heart of the book is part three, composed of 18 chapters in which the author discusses the state of the art of condensed-matter systems, such as topological insulators and even quantum computation.

The last chapter is a clear example of the non-conventional approach proposed by the author: going straight to the point, he does not explain the basics of quantum computation, but rather discusses how to preserve the coherence of the quantum states storing information, in order to maintain the unitary evolution of quantum data-processing algorithms. In his words, "the main method of coherence protection involves excitations having the so-called non-abelian statistics", which, going back to CMP, takes us to the realm of anyons and Majorana qubits. In my opinion, this book is not suitable for undergraduate or first-year graduate students (for whom the classic Condensed Matter Field Theory by Altland and Simons seems more appropriate). Instead, I would keenly recommend it to advanced graduate students and researchers in the field, who will find, in part three, plenty of hot topics that are very well explained and accompanied by complete references.

The post Quantum Field Theory Approach to Condensed Matter Physics appeared first on CERN Courier.

]]>
Review This book provides an excellent overview of the state of the art of quantum field theory (QFT) applications to condensed-matter physics (CMP). https://cerncourier.com/wp-content/uploads/2022/06/9781107074118-feature.jpg
The day the world switched on to particle physics https://cerncourier.com/a/the-day-the-world-switched-on-to-particle-physics/ Fri, 31 Aug 2018 08:00:38 +0000 https://preview-courier.web.cern.ch/?p=12625 What was it that drove one of the biggest media events science has ever seen, and is the LHC still able to capture the public imagination?

The post The day the world switched on to particle physics appeared first on CERN Courier.

]]>
CERN Control Centre

When Lyn Evans, project leader of the Large Hadron Collider (LHC), turned up for work at the CERN Control Centre (CCC) at 05:30 on 10 September 2008, he was surprised to find the car park full of satellite trucks. Normally a scene of calm, the facility had become the focus of global media attention, with journalists poised to capture the moment when the LHC switched on. Evans knew the media were coming, but not quite to this extent. A few hours later, as he counted down to the moment when the first beam had made its way through the last of the LHC’s eight sectors, the CCC erupted in cheers – and Evans wasn’t even aware that his impromptu commentary was being beamed live to millions of people. “I thought I was commenting to others on the CERN site,” he recalls. The following weekend, he was walking in the nearby ski town of Megève when a stranger recognised him in the street.

Of all human endeavours that have captured the world’s attention, the events of 10 September 2008 are surely among the most bizarre. After all, this wasn’t something as tangible as sending a person to the Moon. At 10:28 local time on that clear autumn Wednesday, a bunch of subatomic particles made its way around a 27 km-long subterranean tube, and the spectacle was estimated to have reached an audience of more than a billion people. There were record numbers of hits to the CERN homepage, overtaking visits to NASA’s site, in addition to some 2500 television broadcasts and 6000 press articles on the day. The event was dubbed “first-beam day” by CERN and “Big Bang day” by the BBC, which had taken over a room in the CCC and devoted a full day’s coverage on Radio 4. Google turned its logo into a cartoon of a collider – such “doodles” are now commonplace, but it was a coup for CERN back then. It is hard to think of a bigger media event in science in recent times, and it launched particle physics, the LHC and CERN into mainstream culture.

It is all the more incredible that no collision data, and therefore no physics results, were scheduled that day; it was “simply” part of the commissioning period that all new colliders go through. When CERN’s previous hadron collider, the Super Proton Synchrotron, fired up in the summer of 1981, says Evans, there was just him and Carlo Rubbia in the control room. Even the birth of the Large Electron Positron collider in 1989 was a muted affair. The LHC was a different machine in a different era, and its birth offers a crash course in the communication of big-science projects.

News values

Fears that the LHC would create a planet-eating black hole were a key factor behind the enormous media interest, says Roger Highfield, who was science editor of the UK’s The Telegraph newspaper at the time. “I have no doubt that the public loved all the stuff about the hunt for the secrets of the universe, the romance of the Peter Higgs story and the deluge of superlatives about energy, vacuum and all that,” says Highfield. “But the LHC narrative was taken to a whole new level by the potty claim by doomsayers that it could create a black hole to swallow the Earth. When ‘the biggest and most complex experiment ever devised’ was about to be turned on, it made front-page news, with headlines like, ‘Will the world end on Wednesday?’”.

Journalists

The conspiracies were rooted in attempts by a handful of individuals to prevent the LHC from starting up in case its collisions would produce a microscopic black hole – one of the outlandish models that the LHC was built to test. That the protons injected into the LHC that day had an energy far lower than that of the then-operational Tevatron collider in the US, and that collisions were not scheduled for weeks afterwards, didn’t seem to get in the way of a good story. Nor, for that matter, did CERN’s efforts to issue scientific reassurances. Indeed, when science editor of The Guardian, Ian Sample, turned up at CERN on first-beam day, he expected to find protestors chained to the fence outside, or at least waving placards asking physicists not to destroy the planet. “I did not see a single protestor – and I looked for them,” he says. “And yet, inside the building, I remember one TV host doing a piece to camera on how the world might end when the machine switched on. It was a circus that the media played a massive part in creating. It was shameful and it made the media who seriously ran with those stories look like fools.”

The truth is the black-hole hype came long after the LHC had started to capture the public imagination. As the machine and its massive experiments progressed through construction in the early 2000s, the project’s scale and abstract scientific goals offered an appeal to wonder. Though designed to explore a range of phenomena at a new energy frontier, the LHC’s principal quarry, the Higgs boson, had a bite-sized description: the generator of mass. It also had a human angle – a real-life, white-haired Professor Higgs and a handful of other theorists waiting to see if their half-century-old prediction was right, and international teams of thousands working night and day to build the necessary equipment. Nobel laureate Leon Lederman’s 1993 book The God Particle, detailing the quest for the Higgs boson, added a supernatural dimension to the enterprise.

“I am confident that no editor-in-chief of any newspaper in the world truly understood the Higgs field, the meaning or significance of electroweak symmetry breaking, or how the Higgs boson fits into the picture,” continues Sample. “But what they did get was the appeal of hunting for a particle that in their minds explained the origin of mass. It is such an intriguing concept to imagine that we even need to explain the origin of mass. Isn’t it the case that matter just has mass, plain and simple? All of this, in addition to the sheer awe at the engineering and physics achievement, made for an enormously exotic and appealing story.”

There were also more practical reasons for LHC’s media extravaganza, notes Geoff Brumfiel, a reporter at Nature at the time and now a senior editor at National Public Radio in the US. The fact that pretty much every country and region on Earth had somebody working on the LHC meant that there was a local story for thousands of news outlets, he says, plus CERN’s status as a publicly funded institution made it possible for the lab to open up to the world. “There was also great visual appeal: the enormous, colourful detectors, deep underground, teeming with little scientists in hard hats – it just looked cool. That was hugely important for cable news, television documentary producers, etc.” In addition, says Brumfiel, something actually happened on first-beam day – there was something for journalists to see. “That’s always big in the news business. A big new machine was turning on and might or might not work. And when it worked there were lots of happy people to look at and hear.”

Strategy first

Despite the many external factors influencing LHC communications, the switch-on would never have had the huge reach that it did were it not for a dedicated communication strategy, says James Gillies, CERN’s head of communications at the time. It started as far back as 2000, when Dan Brown’s science-fiction novel Angels & Demons, about a plot to blow up the Vatican using antimatter stolen from CERN, was published. “Luckily for us, it didn’t sell, but it alerted us to the fact that the notion that CERN could be dangerous was bubbling up into popular culture,” says Gillies. A few years later, the BBC made a drama documentary called End Day, which examined a range of ways that humanity might not last the century – including a black hole being created at a particle accelerator. Then, when Dan Brown’s next book, The Da Vinci Code, became a bestseller, CERN realised that Angels & Demons would be next on people’s reading list – so it had better act. “That led to one of the most peculiar conversations that I’ve ever had with a CERN director general, and resulted in us featuring fact and fiction in Angels & Demons on the CERN website,” says Gillies. “Our traffic jumped by an order of magnitude overnight and we never looked back.” CERN later played a significant role in the screen adaption of the book, and Sony Pictures included a short film about CERN in its Blu-ray release.

Scenes from first-beam day

The first dedicated LHC communications strategy was put in place in 2006. The perception of CERN as portrayed in End Day and Angels & Demons was so wide of the mark that it was laughable, says Gillies, so he took it as an opportunity to lead the conversation about CERN and be transparent and timely. In addition to actions such as working with science communicators in CERN Member States and beyond to organise national media visits for key journalists, he says, "the big idea is that we took a conscious decision to do our science in the public eye, to involve people in the adventure of research at the forefront of human knowledge". Publicly fixing the date for first beam was a high-risk strategy, but it paid off. The scheduled LHC start-up exceeded the expectations of everyone involved. Both proton beams made a full turn around the machine and one beam was captured by the radio-frequency system, showing that it could be accelerated. For the thousands of people working on the LHC and its experiments, it marked the transition from 25 years of preparation to a new era of scientific discovery. But the terrain was about to get tougher.

Once the journalists had departed and the champagne bottles were stacked away, the LHC teams continued with the task of commissioning away from the spotlight, with a view to obtaining collisions as soon as possible. Then, a couple of days after first-beam day, a transformer powering part of the LHC’s cryogenic system failed, forcing a pause in commissioning during which the teams decided to test the last octant of the machine for high-current operations. While ramping the magnets towards 9.3 kA on 19 September, one of the LHC’s 10,000 superconducting-dipole interconnects failed, ultimately damaging roughly 400 m of the machine. Evans described the event, which set operations back by 14 months, as “a kick in the teeth”. But CERN recovered quickly (see “Lessons from the accelerator frontier“) and, today, Evans says that he is glad that the fault was discovered when it was. “It would have been a disaster had it happened five years in. As it was, we didn’t come under criticism. We were pushing the limits of technology.”

The timing of the incident was doubly fortuitous: the same week it took place, US investment bank Lehman Brothers filed for the largest bankruptcy in history, with other banks looking set to follow suit. The world might not have been consumed by a black hole, but the prospect of a distinctly more real financial Armageddon dominated the headlines that week.

To collisions and beyond

The coming to life of the LHC is a thrilling story, a scientific fairy-tale. From its long-awaited completion, to the tense sector-by-sector threading of its first beam in front of millions of people and the incident nine days later that temporarily ruined the party, the LHC finally arrived at a new energy frontier in November 2009 (achieving 1.18 TeV per beam). Its physics programme began in earnest a few months later, on 30 March 2010, at a collision energy of 7 and then 8 TeV. Barely two years later, the LHC produced its first major discovery – the Higgs boson, announced to a packed CERN auditorium on 4 July 2012 by the ATLAS and CMS collaborations and webcast around the world. The discovery was followed by the award of the 2013 Nobel Prize in Physics to Peter Higgs and François Englert. The CERN seminar was the first time that the pair had met, with cameras capturing Higgs wiping a tear from his eye as the significance of the event sunk in. Since 2015, the LHC has been operating at 13 TeV while notching up record levels of performance, and the machine is now being prepared for its high-luminosity upgrade (HL-LHC).

VIPs

Has the success of LHC communications set the bar too high? The CERN press office tracked a steady increase in the number of LHC-related articles in the period leading up to the switch-on, in addition to an increasing number of visits by the media and the public. Coverage peaked around September 2008, died down a little, then picked up again four years later as the drama of the Higgs-boson discovery started to unfold. When ATLAS and CMS announced the discovery, press coverage exceeded even that of first-beam day. Of the top 10-read items on The Guardian website, says Sample, stories about the Higgs made up eight or nine of them, when there were plenty of other big news stories around that day. Why? “The absolute competence and dedication and hard work of those scientists and engineers was so refreshing compared to the crooks, bullies, liars and murderers that we write about every day,” he says. “Perhaps people enjoyed reading about something positive, about people doing astounding work, about something far bigger than the world they normally encounter in the news.”

Today, press coverage of the LHC remains higher than it was before the switch-on, with an average of 200 clippings per day worldwide. The number of media visits to CERN, having peaked in around 2008 and 2012, is now at the level that it was before the switch-on, corresponding to around 300 media outlets per year. The LHC’s life so far has also coincided with the explosion of social-media tools. CERN’s first ever tweet, on 7 August 2008, announced the date for first-beam day, and today the lab has more than two million Twitter followers – rising at a rate of around 1000 per day. During the announcement of the Higgs-boson discovery in 2012, CERN’s live tweets reached journalists faster than the press release and helped contribute to worldwide coverage of the news.

Framing the search for the Higgs boson as the LHC’s only physics goal was never the message that CERN intended to put out, but it’s the one that the media latched on to. Echoing others working in the media who were interviewed for this article, Brumfiel thinks that the LHC has largely left the public eye. In terms of the media, he says, “It’s a victim of its own success: it was designed to do one thing, and it’s done it.”

The challenge facing communications at CERN today is how to capitalise on the existing interest while constructing a new or updated narrative of exploration and discovery. After all, in terms of physics measurements, the LHC is only getting into its stride – having collected just 5% of its expected total dataset and with up to two decades of operations still to go. Although the LHC has not yet found any conclusive signs of physics beyond the Standard Model, it is clear from astronomical and other observations that such phenomena are out there, somewhere. In the absence of direct discoveries, identifying the new physics will be a hard slog involving ever more precise measurements of known particles – a much tougher sell to the public, even if it is all part of the same effort to uncover the basic laws of the universe.

Angels & Demons

“CERN has managed to build upon previous communication successes as the public is already interested, so they can simply strap a camera onto a drone, fly it around and a lot of people will happily watch!” says David Eggleton of the Science Policy Research Unit at the University of Sussex in the UK, who studies leadership and governance in major scientific projects such as the LHC. “But, just like with the scientists, the public is going to need something new and exciting to focus on – even if the pay-off is 10 years in the future, so it depends on how the laboratory wants to strategise – do they want to pitch HL-LHC as the next big machine or is it just going to be articulated as an upgrade with the FCC (Future Circular Collider) becoming the thing to capture the public’s imagination?”

Theoretical physicist and science populariser Sabine Hossenfelder of the Frankfurt Institute for Advanced Studies in Germany thinks the excitement surrounding the switch-on of the LHC has come back to haunt the field, going so far as to label the current situation in particle physics a "PR disaster". Before the LHC's launch in 2008, she says, some theorists expressed confidence that the collider would produce new particles besides the Higgs boson. "That hasn't happened. The big proclamations came almost exclusively from theoretical physicists; CERN didn't promise anything that they didn't deliver. That is an important distinction, but I am afraid in the public perception the subtler differences won't matter."

Cultural icon

At least for now, and in some countries, the LHC has become embedded in popular culture. The term “hadron collider” is the new “rocket science” – a term dropped into commentary and public discourse to denote the pinnacle of human ingenuity. The LHC has inspired books, films, plays, art and, crucially, adverts – in which firms have used high-production visuals to associate their brands with the standards of the LHC. The number of applications for physics degrees, in the UK at least, soared around the time that the LHC switched on, and the event also launched the television career of ATLAS physicist Brian Cox, who went on to further engage a primed public. Annually, around 300,000 people apply to visit CERN, less than half of whom can be accommodated.

Press conference

If the communications surrounding the LHC have proved one thing, it is that there is an inherent interest among huge swathes of the global population in the substance of particle physics. Highfield, who is now director of external affairs at the Science Museum in London, sees this on a daily basis. "Although I think physicists would have liked to have seen more surprises, I know from my work at the Science Museum that the public has a huge appetite for smashing physics," he says. In November 2013, the Science Museum launched Collider, an immersive exhibition that blended theatre, video and sound art with real artefacts from CERN to recreate a visit to the laboratory. The exhibition went on international tour, finishing in Australia in April 2017, having pulled in an audience of more than 600,000 people. "Yes, the public still cares about the quest to reveal the deepest secrets of the cosmos," says Highfield.

From a communications perspective, the switch-on of the LHC proves the importance of a clear strategy, the rewards from taking risks, and the difficulty in keeping control of a narrative. For Evans, the LHC changed everything. “Of all the machines that I’ve worked on, never before has there been such interest,” he says. “Before the LHC, no one knew what you were talking about. Now, I can get into a cab in New York or speak to an immigration officer in Japan, and they say: oh, cool, you work at CERN?”.

The post The day the world switched on to particle physics appeared first on CERN Courier.

]]>
Feature What was it that drove one of the biggest media events science has ever seen, and is the LHC still able to capture the public imagination? https://cerncourier.com/wp-content/uploads/2018/08/CCSep18Media-frontis.jpg
Cosmic Anger: Abdus Salam – The First Muslim Nobel Scientist https://cerncourier.com/a/cosmic-anger-abdus-salam-the-first-muslim-nobel-scientist/ Sun, 19 Aug 2018 08:38:18 +0000 https://preview-courier.web.cern.ch/?p=105078 Hafeez Hoorani reviews in 2008 Cosmic Anger: Abdus Salam – The First Muslim Nobel Scientist.

The post Cosmic Anger: Abdus Salam – The First Muslim Nobel Scientist appeared first on CERN Courier.

]]>
by Gordon Fraser. Oxford University Press. Hardback ISBN 9780199208463 £25 ($49.95).

The late Abdus Salam – the only Nobel scientist from Pakistan – came from a small place in the Punjab called Jhang. The town is also famous for “Heer-Ranjha”, a legendary love story of the Romeo-and-Juliet style that has a special romantic appeal in the countryside around the town. Salam turned out to be another “Ranjha” from Jhang, whose first love happened to be theoretical physics. Cosmic Anger, Salam’s biography by Gordon Fraser, is a new, refreshing look at the life of this scientific genius from Pakistan.

CCboo1_09_08

I have read several articles and books about Salam and also met him several times, but I still found Fraser’s account instructive. What I find intriguing and interesting about Cosmic Anger is first the title, and second that each chapter of the book gives sufficient background and historical settings of the events that took place in the life of Salam. In this regard the first three chapters are especially interesting, in particular the third, where the author talks about Messiahs, Mahdis and Ahmadis. This shows in a definitive way the in-depth knowledge that Fraser has about Islam and the region where Salam was born.

In chapter 10, Fraser discusses the special relationship between Salam and the former President of Pakistan, Ayub Khan. I feel that more emphasis should have been placed on the fact that for 16 years, from 1958 to 1974, Salam had the greatest influence on Pakistan's scientific policies. On 4 August 1959, while inaugurating the Atomic Energy Commission, President Ayub said: "In the end, I must say how happy I am to see Prof. Abdus Salam in our midst. His attainments in the field of science at such a young age are a source of pride and inspiration for us and I am sure that his association with the commission will help to impart weight and prestige to the recommendations." Salam was involved in setting up the Atomic Energy Commission and other institutes, such as the Pakistan Institute of Nuclear Science and Technology and the Space and Upper Atmosphere Research Commission in Pakistan.

Finally, I find the book to be a well written account of the achievements of a genius who was a citizen of the world, destined to play a memorable role in the global development of science and technology. At the same time, in many ways Salam was very much a Pakistani. In the face of numerous provocations and frustrations, he insisted on keeping his nationality. He loved the Pakistani culture, its language, its customs, its cuisine and its soil where he was born and is buried.

The post Cosmic Anger: Abdus Salam – The First Muslim Nobel Scientist appeared first on CERN Courier.

]]>
Review Hafeez Hoorani reviews in 2008 Cosmic Anger: Abdus Salam – The First Muslim Nobel Scientist. https://cerncourier.com/wp-content/uploads/2008/08/CCboo1_09_08.jpg
Gravitational Waves Vol 1: Theory and Experiments https://cerncourier.com/a/gravitational-waves-vol-1-theory-and-experiments/ Sun, 19 Aug 2018 08:38:18 +0000 https://preview-courier.web.cern.ch/?p=105079 Carlo Bradaschia reviews in 2009 Gravitational Waves Vol 1: Theory and Experiments.

The post Gravitational Waves Vol 1: Theory and Experiments appeared first on CERN Courier.

]]>
By Michele Maggiore, Oxford University Press. Hardback ISBN 9780198570745 £45 ($90).

This is a complete book for a field of physics that has just reached maturity. Gravitational wave (GW) physics recently arrived at a special stage of development. On the theory side, most of the generation mechanisms have been understood and some technical controversies have been settled. On the experimental side, several large interferometers are now operating around the world, with sensitivities that could allow the first detection of GWs, even if with a relatively low probability. The GW community is also starting vigorous upgrade programmes to bring the detection probability to certitude in less than a decade from now.

The need for a textbook that treats the production and detection of GWs systematically is clear. Michele Maggiore has succeeded in doing this in a way that is fruitful not only for the young physicist starting to work in the field, but also for the experienced scientist needing a reference book for everyday work.

CCboo2_09_08

In the first part, on theory, he uses two complementary approaches: geometrical and field-theoretical. The text fully develops and compares both, which is of great help for a deep understanding of the nature of GWs. The author also derives all equations completely, leaving just the really straightforward algebra for the reader. A basic knowledge of general relativity and field theory is the only prerequisite.

Maggiore explains thoroughly the generation of gravitational radiation by the most important astrophysical sources, including the emitted power and its frequency distribution. One full chapter is dedicated to the Hulse-Taylor binary pulsar, which constituted the first evidence for GW emission. The “tricky” subject of post-Newtonian sources is also clearly introduced and developed. Exercises that are completely worked out conclude most of these theory chapters, enhancing the pedagogical character of the book.

The second part is dedicated to experiments and starts by setting up a background of data-analysis techniques, including noise spectral density, matched filtering, probability and statistics, all of which are applied to pulse and periodic sources and to stochastic backgrounds. Maggiore treats resonant mass detectors first, because they were the first detectors chronologically to have the capability of detecting signals, even if only strong ones originating in the neighbourhood of our galaxy. The study of resonant bar detectors is instructive and deals with issues that are also very relevant to understanding interferometers. The text clearly explains fundamental physics issues, such as approaching the quantum limits and quantum non-demolition measurements.

The last chapter is devoted to a complete and detailed study of the large interferometers – the detectors of the current generation – which should soon make the first detection of GWs. It discusses many details of these complex devices, including their coupling to gravitational waves, and it makes a careful analysis of all of the noise sources.

Lastly, it is important to remark on a little word that appears on the cover: “Volume 1”. As the author explains in the preface, he is already working on the second volume. This will appear in a few years and will be dedicated to astrophysical and cosmological sources of GWs. The level of this first book allows us to expect an interesting description of all “we can learn about nature in astrophysics and cosmology, using these tools”.

The post Gravitational Waves Vol 1: Theory and Experiments appeared first on CERN Courier.

]]>
Review Carlo Bradaschia reviews in 2009 Gravitational Waves Vol 1: Theory and Experiments. https://cerncourier.com/wp-content/uploads/2008/08/CCboo2_09_08.jpg
The Cosmological Singularity https://cerncourier.com/a/the-cosmological-singularity/ Mon, 09 Jul 2018 14:58:50 +0000 https://preview-courier.web.cern.ch/?p=101356 Quite technical and advanced, this book is meant for theoretical and mathematical physicists working on general relativity, supergravity and cosmology.

The post The Cosmological Singularity appeared first on CERN Courier.

]]>
By Vladimir Belinski and Marc Henneaux
Cambridge University Press

This monograph discusses at length the structure of the general solution of the Einstein equations with a cosmological singularity in Einstein-matter systems in four and higher space–time dimensions, starting from the fundamental work of Belinski (the book’s lead author), Khalatnikov and Lifshitz (BKL) – published in 1969.

The text is organised in two parts. The first, comprising chapters one to four, is dedicated to an exhaustive presentation of the BKL analysis. The authors begin by deriving the oscillatory, chaotic behaviour of the general solution for pure Einstein gravity in four space–time dimensions, following the original approach of BKL. In chapters two and three, homogeneous cosmological models and the nature of the chaotic behaviour near the cosmological singularity are discussed. In these three chapters, the properties of the general solution of the Einstein equations are studied in the case of empty space in four space–time dimensions. The fourth chapter deals with different systems: perfect fluids in four space–time dimensions; gauge fields of the Yang–Mills and electromagnetic types and scalar fields, also in four space–time dimensions; and pure gravity in higher dimensions.

The second part of the book (chapters five to seven) is devoted to a model in which the chaotic oscillations discovered by BKL can be described in terms of a “cosmological billiard” system. In chapter five, the billiard description is provided for pure Einstein gravity in four dimensions, without any simplifying symmetry assumption, while the following chapter extends this analysis to arbitrary higher space–time dimensions and to general systems containing gravity coupled to matter fields. Finally, chapter seven covers the intriguing connection between the BKL asymptotic regime and Coxeter groups of reflections in hyperbolic space. Four appendices complete the treatment.

Quite technical and advanced, this book is meant for theoretical and mathematical physicists working on general relativity, supergravity and cosmology.

The post The Cosmological Singularity appeared first on CERN Courier.

]]>
Review Quite technical and advanced, this book is meant for theoretical and mathematical physicists working on general relativity, supergravity and cosmology. https://cerncourier.com/wp-content/uploads/2022/06/41vUoeSUK1L.jpg
Gravitational Lensing https://cerncourier.com/a/gravitational-lensing/ Mon, 09 Jul 2018 14:58:49 +0000 https://preview-courier.web.cern.ch/?p=101354 Based on university lectures given by the author, this book provides an overview of gravitational lensing, which has emerged as a powerful tool in astronomy with numerous applications, ranging from the quest for extrasolar planets to the study of the cosmic mass distribution.

The post Gravitational Lensing appeared first on CERN Courier.

]]>
By Scott Dodelson
Cambridge University Press

Based on university lectures given by the author, this book provides an overview of gravitational lensing, which has emerged as a powerful tool in astronomy with numerous applications, ranging from the quest for extrasolar planets to the study of the cosmic mass distribution.

Gravitational lensing is a consequence of general relativity (GR): the gravitational field of a massive object causes light rays passing close to it to bend and refocus somewhere else. As a consequence, any treatment of this topic has to make reference to GR; nevertheless, as the author highlights, not much formalism is required to learn how to apply lensing to specific problems. Thus, using very little GR and not overly complex mathematics, this text presents the basics of gravitational lensing, focusing on the equations needed to understand the phenomenon. It then dives into a number of applications, including multiple images, time delays, exoplanets, microlensing, cluster masses, galaxy shape measurements, cosmic shear and lensing of the cosmic microwave background.
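To give a flavour of how little formalism is involved – the example below is ours, not reproduced from the book – the deflection of a light ray passing a point mass M at impact parameter b, and the lens equation relating the true source position β to the observed image position θ, can be written as

\[
\hat{\alpha} = \frac{4GM}{c^{2}b}, \qquad \beta = \theta - \frac{D_{\mathrm{ds}}}{D_{\mathrm{s}}}\,\hat{\alpha}(\theta),
\]

where D_ds and D_s are the lens–source and observer–source distances. Much of the phenomenology treated in the book – multiple images, time delays, microlensing light curves – follows from variants of this simple relation.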

Written with a pedagogical approach, this book is meant as a textbook for one-semester undergraduate or graduate courses. But it can also be used for independent study by researchers interested in entering this fascinating and fast-evolving field.

The post Gravitational Lensing appeared first on CERN Courier.

]]>
Review Based on university lectures given by the author, this book provides an overview of gravitational lensing, which has emerged as a powerful tool in astronomy with numerous applications, ranging from the quest for extrasolar planets to the study of the cosmic mass distribution. https://cerncourier.com/wp-content/uploads/2022/06/51m2lkWxssL.jpg
Quantum Fields: From the Hubble to the Planck Scale https://cerncourier.com/a/quantum-fields-from-the-hubble-to-the-planck-scale/ Mon, 09 Jul 2018 14:58:49 +0000 https://preview-courier.web.cern.ch/?p=101355 This book treats two fields of physics that are usually taught separately – quantum field theory (QFT) on one side and cosmology and gravitation on the other – in a more unified manner.

The post Quantum Fields: From the Hubble to the Planck Scale appeared first on CERN Courier.

]]>
By Michael Kachelriess
Oxford University Press

This book treats two fields of physics that are usually taught separately – quantum field theory (QFT) on one side, and cosmology and gravitation on the other – in a more unified manner. Kachelriess adopts this unusual approach because he is convinced that, beyond studying a subject in depth, the real difficulty often lies in putting the pieces into a general picture. He therefore makes an effort to introduce QFT together with its most important applications to cosmology and astroparticle physics in a coherent framework.

The path-integral approach is employed from the start and the use of tools such as Green’s functions in quantum mechanics and in scalar field theory is illustrated. Massless spin-1 and spin-2 fields are introduced on an equal footing, and gravity is presented as a gauge theory in analogy with the Yang–Mills case. The book also deals with various concepts relevant to modern research, such as helicity methods and effective theories, as well as applications to advanced research topics.

This volume can serve as a textbook for courses in QFT, astroparticle physics and cosmology, and students interested in working at the interface between these fields will certainly appreciate the uncommon approach used. It was also the author’s intention to make the book suitable for self-study, so all explanations and derivations are given in detail. Nevertheless, a solid knowledge of calculus, classical and quantum mechanics, electrodynamics and special relativity is required.

The post Quantum Fields: From the Hubble to the Planck Scale appeared first on CERN Courier.

]]>
Review This book treats two fields of physics that are usually taught separately – quantum field theory (QFT) on one side and cosmology and gravitation on the other – in a more unified manner. https://cerncourier.com/wp-content/uploads/2022/06/51AZCMnuTEL.jpg
What goes up… Gravity and Scientific Method https://cerncourier.com/a/what-goes-up-gravity-and-scientific-method/ Mon, 09 Jul 2018 14:58:48 +0000 https://preview-courier.web.cern.ch/?p=101353 Carlos Lourenço reviews in 2018 What goes up... Gravity and Scientific Method.

The post What goes up… Gravity and Scientific Method appeared first on CERN Courier.

]]>
By Peter Kosso
Cambridge University Press

Peter Kosso states that his book is “about the science of gravity and the scientific method”; I would say that it is about how scientific knowledge develops over time, using the historical evolution of our understanding of gravity as a guiding thread. The author has been a professor of philosophy and physics, with expert knowledge on how the scientific method works, and this book was born out of his classes. The topic is presented in a clear way, with certain subjects explored more than once as if to ensure that the student gets the point. The text was probably repeatedly revised to remove any wrinkles in its surface and provide smooth reading, setting out a few basic concepts along the way. The downside of this “textbook style” is that it is unexpectedly dry for a book aimed at a broad audience.

As the author explains, a scientific observation must refer to formal terms with universally-agreed meaning, ideally quantifiable in a precise and systematic way, to facilitate the testing of hypotheses. Thinking in the context of a certain theory will specify the important questions and guide the collection of data, while irrelevant factors are to be ignored (Newton’s famous apple could just as well have been an orange, for example). But theoretical guidance comes with the risk that the answers might too easily conform to the expectation and, indeed, the nontrivial give-and-take between theory and observation is a critical part of scientific practice. In particular, the author insists that it is naïve to think that a theory is abandoned or significantly revised as soon as an experimental observation disagrees with the corresponding prediction.

Considering that the scientific method is the central topic of this book, it is surprising that no reference is made to Karl Popper and many other relevant thinkers; this absence is all the more remarkable since, on the contrary, Thomas Kuhn is mentioned a few times. One might expect such a book to reflect a basic enlightenment principle more faithfully: the price of acquiring knowledge is that it will be distorted by the conditions of its acquisition, so that keeping a critical mind is a mandatory part of the learning process. For instance, when the reader is told that the advancement of science benefits from the authority of established science (the structural adhesive of Kuhn’s paradigm), it would have been appropriate to also mention the “genetic fallacy” committed when we infer the validity and credibility of an idea from our knowledge of its source. The author could then have pointed the interested reader to suitable literature, one option (among many) being Kuhn vs. Popper: The Struggle for the Soul of Science by Steve Fuller.

What goes up… is certainly an excellent guide to the science of gravity and its historical evolution, from the standpoint of a 21st-century expert. It is interesting, for instance, to compare the “theories of principle” of Aristotle and Einstein with the “constructive theory” of Newton. While Newton started from a wealth of observations and looked for a universal description, unifying the falling apple with the orbiting Moon, Einstein gave more importance to the beauty of the concepts at the heart of relativity than to its empirical success. I enjoyed reading about the discovery of Neptune from the comparison between the precise observations of the orbit of Uranus and the Newtonian prediction, and about the corresponding (unsuccessful) search for the planet Vulcan, supposedly responsible for Mercury’s anomalous orbit until general relativity provided the correct explanation. And it is fascinating to read about the “direct observation” of dark matter in the context of the searches for Neptune and Vulcan. It is important (but surely not easy) to ensure “that a theory is accurate in the conditions for which it is being used to interpret the evidence”, and that it is “both well-tested and independent of any hypothesis for which the observations are used as evidence”.

The text is well written and accessible. My teenage children learned about non-Euclidean geometry from figures in the book and were intrigued by the thought that gravity is not a force field but rather a metric field, which determines the straightest possible lines (geodesics) between two points in space–time. I think, however, that progress in humankind’s understanding of gravity and related topics could be narrated in a more captivating way. People who prefer more vivid and passionate accounts of the lives and achievements of Copernicus, Brahe, Kepler, Galileo, Newton and many others would more likely enjoy The Sleepwalkers by Arthur Koestler or From the Closed World to the Infinite Universe by Alexandre Koyré. I also vehemently recommend chapter one of Only the Longest Threads by Tasneem Zehra Husain, a delightful account of Newton’s breakthrough from the perspective of someone living in the early 18th century.

The post What goes up… Gravity and Scientific Method appeared first on CERN Courier.

]]>
Review Carlos Lourenço reviews in 2018 What goes up... Gravity and Scientific Method. https://cerncourier.com/wp-content/uploads/2022/06/9781107129856i.jpg
Welcome to the Universe https://cerncourier.com/a/welcome-to-the-universe/ Mon, 09 Jul 2018 14:58:47 +0000 https://preview-courier.web.cern.ch/?p=101352 Andrea Giammanco reviews in 2018 Welcome to the Universe.

The post Welcome to the Universe appeared first on CERN Courier.

]]>
by Neil deGrasse Tyson, Michael A Strauss and J Richard Gott
Princeton University Press

It is commonly believed that popular-science books should abstain as much as possible from using equations, apart from the most iconic ones, such as E = mc². The three authors of Welcome to the Universe boldly defy this stereotype in a book that is intended to guide readers with no previous scientific education from the very basics (the first chapters explain scientific notation, how to round numbers and some trigonometry) to cutting-edge research in astrophysics and cosmology.

This book reflects the content of a course that the authors gave for a decade to non-science majors at Princeton University. They are a small dream team of teachers and authors: Tyson is a star of astrophysics outreach, Strauss a renowned observational astronomer and Gott a theoretical cosmologist with other successful popular-science books to his name. The authors split the content of the book into three equal parts (stars and planets, galaxies, relativity and cosmology), making no attempt at stylistic uniformity. Apparently this was the intention, as they keep their distinct voices and refer frequently to their own research experiences to engage the reader. Despite this, the logical flow remains coherent, with a smooth progression in complexity.

Welcome to the Universe promises and delivers a lot. Non-scientist readers will get a rare opportunity to be taken from a basic understanding of the subject to highly advanced content, not only giving them the “wow factor” (although the authors do appeal to this a lot) but also approaching the depth of a master’s course in physics. A representative example is the lengthy derivation of E = mc², the popular formula that everyone is familiar with but few know how to explain. And while that particular example is probably demanding for the layperson, most chapters are very pleasant to read, with a good balance of narration and analysis. The authors also make a point of explaining why recognised geniuses such as Einstein and Hawking got their fame in the first place. Scientifically educated readers will find many insights in this volume too.

While I generally praise this book, it does have a few weak points. Some of the explanations are non-rigorous and confusing at the same time (an example of this is the sentence: “the formula has a constant h that quantises energy”). In addition, an entire chapter boasts of the role of one of the authors in the debate on whether Pluto has the status of a planet or not, which I found a bit out of place. But these issues are more irritating than harmful, and overall this book achieves an excellent balance between clarity and accuracy. The authors introduce several original analogies and provide an excellent non-technical explanation of the counterintuitive behaviour of the outer parts of a dying star, which expand while the inner parts contract.

I also appreciated the general emphasis on how measurements are done in practice, including an interesting digression on how Cavendish measured Newton’s constant more than two centuries ago. However, there are places where one feels the absence of such an explanation, for example, the practical limitations of measuring the temperatures of distant bodies are glossed over with a somewhat patronising “all kinds of technical reasons”.

This text comes with a problem book that is a real treasure trove. The exercises proposed are very diverse, reflecting the variety of audiences that the authors clearly target with their book. Some are meant to practise basic competences with units, orders of magnitude and rounding. Others demand that readers think outside the box (e.g. by playing with geodesics in flatland, one learns how to construct an object that is larger inside than outside, and has to estimate its mass using only trigonometry). For some of the quantitative exercises, the solution is provided twice: once in a lengthy way and then in a clever way. People more versed in literature than mathematics will find an exercise that asks them to write a scientifically accurate short science-fiction story (guidelines for grading are offered to the teachers) and one that simply asks, “If you could travel in time, which epoch would you visit and why?”

The book ends with a long and inspiring digression on the role of humans in the universe, and Gott’s suggestion of using the Copernican principle to predict the longevity of civilisations – and of pretty much everything – is definitely food for thought.

The post Welcome to the Universe appeared first on CERN Courier.

]]>
Review Andrea Giammanco reviews in 2018 Welcome to the Universe. https://cerncourier.com/wp-content/uploads/2022/06/51eFaDzKbWL.jpg
Creativity across cultures https://cerncourier.com/a/creativity-across-cultures/ Mon, 09 Jul 2018 10:25:17 +0000 https://preview-courier.web.cern.ch/?p=12503 Data from the ATLAS experiment is a key element in HALO, an important new commission undertaken for Art Basel, the world’s premier fair for contemporary art.

The post Creativity across cultures appeared first on CERN Courier.

]]>
HALO

Lift up your eyes as you walk through the principal entrance to CERN’s main building and you will see a tangled iron coil suspended above the central staircase. Rather like electron orbitals marking out the shape of an atom, the structure’s overlapping lines form hints of something more tangible that changes as you move – a human body. Here, in his sculpture Feeling Material XXXIV, the artist Antony Gormley has spun a chaotic spiralling line that envelops the body’s space.

Artists, like scientists, have always been keen observers of the world about them and Gormley is no exception. It was his interest in how spaces are delineated that led to his first contacts with CERN physicist Michael Doser in 2006, and ultimately to his donation of Feeling Material XXXIV to the Organization in 2008. Over the years many artists have visited CERN, intrigued by its research; the American performance artist James Lee Byars even featured on the cover of CERN Courier in September 1972. And in the 1990s, British artist and film-maker Ken McMullen visited the laboratory as a result of his friendship with the daughter of the late Maurice Jacob, a well-known CERN theorist. The visit sowed the seeds for a major project, Signatures of the Invisible, based on a collaboration between the London Institute and CERN. This project brought 11 established artists from various countries, including McMullen, to work with scientists and technicians at CERN during 1999–2000, resulting in works of art that were exhibited worldwide (CERN Courier May 2001 p23).

Bello and Kim

The experience proved rewarding for both sides. Writing in the Courier (July 2001, p30), Ian Sexton, the CERN technician who worked with laser cutting and other techniques on McMullen’s piece Crumpled Theory, described his pleasure at seeing the artist’s first sight of the completed work “simply presented on the workshop floor, with sunlight streaming through the blinds. Ken was … delighted. His enthusiasm was a most unusual experience for me. Normally on completion of a job at CERN a perfunctory ‘thank you’ is the only response.”

The project had involved a significant commitment by CERN. The Press Office managed the project on the Organization’s behalf, a number of scientists became deeply involved, and the artists were offered the use of the laboratory’s workshop – all of which implied a great deal of disruption and additional work for those concerned. So perhaps there were reservations in the minds of some at CERN when a new “science and art” initiative began to take shape. In 2009, creative producer Ariane Koek decided to use the award of a Clore Fellowship to come to CERN and – with the encouragement of the Director-General at the time, Rolf Heuer – work out how to establish and fund an artists’ residency scheme. Heuer was suitably impressed by her proposals and the following year, after a selection process, Koek was taken on to set up Arts at CERN.

Cultural policy

These efforts bore fruit in August 2011 with the launch of CERN’s first-ever cultural policy. Its central element is a selection process for arts engagement with CERN, with a cultural board for the arts to advise on projects and collaborations involving CERN. The initiative brought order and direction to what had been an ad-hoc approach to CERN’s involvement with the arts.


The first outwardly visible outcome of Arts at CERN was a competition, Collide at CERN, announced in 2011 and open to artists from anywhere in the world. A key element was to pair winning artists with scientists at CERN during a residency lasting up to three months. One strand in this award – the Prix Electronica Collide @ CERN prize for Digital Arts – was set up in collaboration with Austria-based digital arts organisation Ars Electronica, and the residency consisted of two months at CERN and one month at Ars Electronica’s research and development lab. The second strand – Collide @ CERN Geneva – marked a partnership with the City and Canton of Geneva, and in the first year was for dance and performance.

Antye Greie-Ripatti

The partnership with Ars Electronica was a coup for CERN and the new cultural policy. Over 40 years, Ars Electronica had built up a formidable reputation in bringing artists, scientists and engineers together. Widely publicised by CERN, the partnership was well received in the arts world, but it was perhaps not so well understood at CERN. Was this something that CERN should be doing and who was paying for it all?

Heuer, who was instrumental in initiating Arts at CERN, was always clear on the first point. “The arts and science are inextricably linked; both are ways of exploring our existence, what it is to be human and what is our place in the universe,” he said on launching the cultural policy. Commenting later after three years of successful partnership with Ars Electronica, he wrote: “The level of heated debate about the so-called two-cultures is a constant source of bafflement to me. Of course arts and science are linked. Both are about creativity. Both require technical mastery. And both are about exploring the limits of human potential.”

Feeling Material XXXIV

Regarding the second point, Arts at CERN was conceived from the start to be mainly self-funding. Support from CERN initially came through its programmes for fellows and students. Funding for the original Collide at CERN programme came from Ars Electronica, the City and the Canton of Geneva, and individual private donors, as well as from the UNIQA Insurance Group, which has a long association with CERN and continues to sponsor the Collide programme. Currently, FACT (Foundation for Art and Creative Technology), based in Liverpool in the UK, is the main partner for the Collide International award, while the Republic and Canton of Geneva and the City of Geneva support the Collide Geneva strand. Collide Geneva is now awarded in alternate years with Collide Pro Helvetia, in which artists from across Switzerland can compete for a residency funded principally by Pro Helvetia, the Swiss Arts Council. In addition, via a slightly different scheme called Accelerate, each year ministries or foundations in two different countries fund two artists working in different domains to come to CERN for one month. A further essential strand that has existed from the beginning brings many more artists to the Laboratory. Originally named Visiting Artists, now called Guest Artists, it hosts up to 10 artists a year who are specially selected for a visit of one to two days which they fund themselves. Together these three strands form the Arts at CERN programme.

A new era begins

By autumn 2014, when the call went out for a new person to head Arts at CERN, the programme had already earned a global reputation. Two internationally known artists and recipients of the Collide International award epitomise this reach: Bill Fontana and Ryoji Ikeda. Sound-sculptor Fontana, from the US, had produced works based on sounds across the globe when he was awarded the 2013 international residency. He explored sounds recorded in the LHC tunnel in works such as Acoustic Time Travel (CERN Courier December 2012 p32), and was followed a year later by Ikeda, Japan’s leading electronic composer and visual artist, who used his residency to inform his works supersymmetry and micro|macro.

Quantum

This growing reputation within the science and art scene appealed in particular to the art historian and curator Mónica Bello, who had more than 15 years’ experience in curating and managing cultural programmes in art, science and technology institutions in different countries and had spent five years as artistic director of the VIDA International Art and Artificial Life Awards. Educated in modern and contemporary art history, she had been exposed to new ideas emerging at the boundaries of modern art and become passionate about the fusion between science and art. “I like art that is based on open processes, where different agents – the artists, researchers, even the audience – can join together to become the project,” she explains. “Experimentation with openness is the most exciting thing that’s happening in the arts right now – and CERN is the place to be for that.”

Bello took up her position at CERN in March 2015, joining co-ordinator Julian Caló. In accelerator terms, by the following year, Arts at CERN was already running at its design energy and beyond. The programme was bringing artists to the laboratory for as many as four residencies a year, and the number of entries for the Collide International award had risen from some 400 when it was launched in 2011 to around 1000.

Bello’s main vision is to move the main focus beyond exploration and artistic research towards the further development of new art commissions and exhibitions. Continuing to support the artists once they finish their residencies at CERN is essential for this, and one way is to connect the artists with CERN scientists that have links to the cities of the programmes’ partners. This was initiated with Liverpool, where artists spent a one-month residency at FACT after being at CERN and where they were connected with research groups at Liverpool University led by LHCb physicist Tara Shears. Connecting CERN to international cultural organisations is part of the same objective, through links formed with different cities and countries.

These new developments are fully supported by CERN’s current management. Earlier this year, Director-General Fabiola Gianotti made her views on the “two cultures” clear at the World Economic Forum in Davos: “Too often people put science and humanities, or science and the arts, in different compartments… but they do have much in common. They are the highest expression of the creativity and the curiosity of humanity. We should really talk about culture in general, and not focus on one particular sector of culture. This is an important message we should be giving to teachers and to young people, for a better world, so they can grow to face the challenges of society.”

Semiconductor duo

Arts at CERN currently has a clear home within CERN’s international relations sector. The aim is to provide stability for the programme, with a view to making it self-sustaining with separate funding within the context of the CERN & Society Foundation. At the same time, Arts at CERN forms part of a broader interest at CERN in the arts, which includes a distinctive and complementary programme Arts@CMS. This education and outreach initiative of the CMS collaboration has set up school-based projects and collaborations with artists with the aim of inspiring a greater appreciation of CERN’s science within the public at large.

A new production scheme

Arts at CERN has clearly been a resounding success with the arts community, and reaching audiences beyond the confines of the laboratory has proved no problem at all. Nor has it been difficult to find scientists willing to work with the artists; more than 200 have so far been involved. But it is by no means easy to mount an exhibition at a scientific laboratory – in places almost an industrial site – where health and safety are of paramount importance. There have been some obvious artistic interventions, such as when choreographer Gilles Jobin, winner of the 2012 Collide Geneva award, installed dancers in the CERN restaurant and computer centre, and even the library; and his project Quantum – a fusion of dance and lighting installation developed with Julius von Bismarck, the first Collide International artist-in-residence – debuted in the CMS cavern during CERN’s open days in 2013, before embarking on an international tour (CERN Courier November 2013 p29).


More recently, as part of Geneva’s annual Electron Festival in 2016, the winners of the 2014 Collide Geneva award, Rudy Decelière and Vincent Hänni, showed work developed with experimentalist Robert Kieffer and theorist Diego Blas. Their sound installation Horizons Irrésolus (2016) – which consists of 888 micro-synthesisers and speakers, network cable and nylon thread – was installed at CERN for visits during the Easter weekend when the festival traditionally takes place. The effort required by many people at CERN included registration to allow access to the Meyrin site and a shuttle bus to take visitors to see the installation. The response was enthusiastic, but not large.

It is to address such problems that Bello is developing the production and exhibition stages of the residencies. Rather as a scientific experiment evolves from conception to data-collection, analysis and publication, so does an artistic endeavour evolve from exploration to production and exhibition. The original focus of the residencies at CERN was on exploration: having artists and scientists come together to evolve ideas. In the new phase for Arts at CERN, at the end of their residencies, artists will be invited to propose a work to be considered for additional funding for production. The aim is to collaborate with other institutes to co-produce cultural works for ideas that are worth developing, and to curate the resulting work so that it can be shown and shared with the CERN community.

Jan Peters

In 2016 CERN began a new collaboration for the Collide International award involving FACT, ushering in the production phase. To support production of the artworks, CERN and FACT have brought together several important European cultural organisations under the umbrella of ScANNER (Science and Art Network for New Exhibitions and Research). Supported by ScANNER, a major exhibition will open at FACT in November 2018 showcasing artworks from, among others, the 2018 Collide International award winner and four previous winners: Semiconductor (2015), Yunchul Kim (2016), studio hrm199 led by Haroon Mirza (2017) and Suzanne Treister (2018). The exhibition will then tour all venues of the ScANNER members during 2019 and 2020.

Meanwhile, Arts at CERN continues to be a major influence across an impressive range of artistic areas. For example, in her project Quantum Nuggets, designer Laura Couto (Collide Pro Helvetia award 2017) has developed a computer program to enable other artists and designers to produce 3D shapes based on collision data from the LHC, thus creating real objects, such as furniture, that echo the invisible quantum world of particle physics. And in February this year Cheolwon Chang (Accelerate Korea) spent time at CERN finding out about the geometric properties of nature and how mathematics influences our further understanding of the universe. The winners of two awards for 2018 were announced in March: Suzanne Treister (Collide International) and Anne Sylvie Henchoz and Julie Lang (Collide Geneva).

Most recently, a prestigious commission for Art Basel, held on 11–17 June and guest-curated by Bello, has highlighted the pinnacles that Arts at CERN is reaching. Swiss watchmaker Audemars Piguet, a partner of Art Basel, chose the British artist-duo Semiconductor to create the Audemars Piguet Art Commission for the 2018 fair. Ruth Jarman and Joe Gerhardt, who work together under the name Semiconductor, were the recipients of the 2015 Collide International award, and for Art Basel they created HALO – an installation that surrounds visitors with data collected by the ATLAS experiment at the LHC. HALO consists of a 10 m-wide cylinder defined by vertical piano wires, within which a 4 m-tall screen displays particle collisions. The data also trigger hammers that strike the wires and set up vibrations to create a multisensory experience.

This important commission is testament to the impact that the Arts at CERN programme is having in the world of contemporary art, and underlines its importance in bringing together apparently disparate ways of viewing and making sense of the world and the universe in which we live. There are many people who say they do not appreciate modern art, just as there are many who say that they never liked physics. But with modern art, just as with modern physics, making a little effort can open up remarkable new ways of thinking about our place in space and time. Arts at CERN is very clearly bringing people together in ways that open their minds and allow them not necessarily to understand but to appreciate how others view the world about us.

  • This article was modified on 5 September 2018.

 

The post Creativity across cultures appeared first on CERN Courier.

]]>
Feature Data from the ATLAS experiment is a key element in HALO, an important new commission undertaken for Art Basel, the world’s premier fair for contemporary art. https://cerncourier.com/wp-content/uploads/2018/07/CCJulAug_Artslight.jpg
What is Quantum Information? https://cerncourier.com/a/what-is-quantum-information/ Fri, 01 Jun 2018 15:25:58 +0000 https://preview-courier.web.cern.ch/?p=101394 This book debates the topic of quantum information from both a physical and philosophical perspective, addressing the main questions about its nature.

The post What is Quantum Information? appeared first on CERN Courier.

]]>
By O Lombardi, S Fortin, F Holik and C López (eds.)
Cambridge University Press

This book debates the topic of quantum information from both a physical and philosophical perspective, addressing the main questions about its nature. At present, different interpretations of the notion of information coexist and quantum mechanics brings in many puzzles; as a consequence, the editors note, there is not yet a generally agreed-upon answer to the question “what is quantum information?”.

The chapters are organised in three parts. The first is dedicated to presenting various interpretations of the concept of information and addressing the question of the existence of two qualitatively different kinds of information (classical and quantum). The links between this concept and other notions, such as knowledge, representation, interpretation and manipulation, are discussed as well.

The second part is devoted to the relationship between informational and quantum issues, and deals with the entanglement of quantum states and the notion of pragmatic information. Finally, the third part analyses how probability and correlation underlie the concept of information in different problem domains, as well as the issue of the ontological status of quantum information.

Providing an interdisciplinary examination of quantum information science, this book is aimed at philosophers of science, quantum physicists and information-technology experts who are interested in delving into the multiple conceptual and philosophical problems inherent to this recently born field of research.

The post What is Quantum Information? appeared first on CERN Courier.

]]>
Review This book debates the topic of quantum information from both a physical and philosophical perspective, addressing the main questions about its nature. https://cerncourier.com/wp-content/uploads/2019/03/CCJune18_Book-lombardi.jpg
The Black Book of Quantum Chromodynamics: A Primer for the LHC Era https://cerncourier.com/a/the-black-book-of-quantum-chromodynamics-a-primer-for-the-lhc-era/ Fri, 01 Jun 2018 15:20:46 +0000 https://preview-courier.web.cern.ch/?p=101388 This book provides a comprehensive overview of the physics of the strong interaction, which is necessary to analyse and understand the results of current experiments at particle accelerators.

The post The Black Book of Quantum Chromodynamics: A Primer for the LHC Era appeared first on CERN Courier.

]]>
By J Campbell, J Huston and F Krauss
Oxford University Press

Also available at the CERN bookshop

This book provides a comprehensive overview of the physics of the strong interaction, which is necessary to analyse and understand the results of current experiments at particle accelerators. In particular, the authors aim to show how to apply the framework of perturbative QCD to the prediction, as well as the correct interpretation, of signals and backgrounds at the Large Hadron Collider (LHC).

The book consists of three parts. In the first, after a brief introduction to the LHC and the present hot topics in particle physics, a general picture of high-energy interactions involving hadrons in the initial state is developed. The relevant terminology and techniques are reviewed and worked out using standard examples.

The second part is dedicated to a more detailed discussion of various aspects of the perturbative treatment of the strong interaction in hadronic reactions. Finally, in the last section, experimental findings are confronted with theoretical predictions.

Primarily addressed at graduate students and young researchers, this book can also be a helpful reference for advanced scientists. In fact, it can provide the right level of knowledge for theorists to understand data more in depth and for experimentalists to be able to recognise the advantages and disadvantages of different theoretical descriptions.

The reader is assumed to be familiar with concepts of particle physics such as the calculation of Feynman diagrams at tree level and the evaluation of cross sections through phase-space integration of analytical expressions. However, a short review of these topics is given in the appendices.

The post The Black Book of Quantum Chromodynamics: A Primer for the LHC Era appeared first on CERN Courier.

]]>
Review This book provides a comprehensive overview of the physics of the strong interaction, which is necessary to analyse and understand the results of current experiments at particle accelerators. https://cerncourier.com/wp-content/uploads/2019/03/CCJune18_Book-campbell.jpg
In Praise of Simple Physics: The Science and Mathematics behind Everyday Questions https://cerncourier.com/a/in-praise-of-simple-physics-the-science-and-mathematics-behind-everyday-questions/ Fri, 01 Jun 2018 15:20:45 +0000 https://preview-courier.web.cern.ch/?p=101387 In this book, popular-science writer Paul Nahin presents a collection of everyday situations in which the application of simple physical principles and a bit of mathematics can make us understand how things work.

The post In Praise of Simple Physics: The Science and Mathematics behind Everyday Questions appeared first on CERN Courier.

]]>
By Paul J Nahin
Princeton University Press


In this book, popular-science writer Paul Nahin presents a collection of everyday situations in which the application of simple physical principles and a bit of mathematics can make us understand how things work. His aim is to bring these scientific disciplines closer to the layperson and, at the same time, to show the wonder lying behind many aspects of reality that are often taken for granted.

The problems presented and explained are very diverse, ranging from how to extract more energy from renewable sources and how best to catch a baseball, to how to measure gravity in one’s garage and why the sky is dark at night. These topics are treated in an informal and entertaining way, but without forgoing the maths. In fact, as the author himself highlights, he is interested in keeping the discussions simple, but not so simple that they are simply wrong. The whole point of the book is actually to show how physics and some calculus can explain many of the things that we commonly encounter.

Engaging and humorous, this text will appeal to non-experts with some background in maths and physics. It is suited to students at any level beyond the last years of high school, as well as to practicing scientists who might discover alternative, clever ways to solve (and explain) everyday physics problems.

The post In Praise of Simple Physics: The Science and Mathematics behind Everyday Questions appeared first on CERN Courier.

]]>
Review In this book, popular-science writer Paul Nahin presents a collection of everyday situations in which the application of simple physical principles and a bit of mathematics can make us understand how things work. https://cerncourier.com/wp-content/uploads/2022/06/81altc6UJ4L.jpg
Calorimetry: Energy Measurement in Particle Physics (2nd edition) https://cerncourier.com/a/calorimetry-energy-measurement-in-particle-physics-2nd-edition/ Fri, 01 Jun 2018 15:12:52 +0000 https://preview-courier.web.cern.ch/?p=101375 Sergio Bertolucci reviews in 2018 Calorimetry: Energy Measurement in Particle Physics.

The post Calorimetry: Energy Measurement in Particle Physics (2nd edition) appeared first on CERN Courier.

]]>
By Richard Wigmans
Oxford Science Publications

When the first edition of this book appeared in 2000, it established itself as “the bible of calorimetry” – not only because of the exhaustive approach to this subtle area of detection, but also because its author enjoyed worldwide recognition within the field. Wigmans gained it thanks to his ground-breaking work on the quantitative understanding of so-called compensating calorimeters (i.e. how to equalise the response of such detectors for electromagnetic and hadronic interactions) and to the leading role he played in designing and operating large detectors that are still considered to be state of the art.

As with the real Bible, which underwent several revisions, this book has been revised in depth and published in a second edition. The author has updated it to take into account the last 16 years of progress in the field and to improve its impact as a reference for both students and practitioners.

At first look, one immediately notices that considerable work has been put into improving the quality of the graphics and figures – introducing colours where appropriate – and this new edition is available as an e-book. But there is much more to this updated version.

Chapters two to six, in which the fundamentals of calorimetry are discussed, follow the same thorough structure of the first edition, but they include new insights and use more recent data for illustration, mostly coming from the LHC experiments. Chapters one (Seventy Years of Calorimetry), seven (Performance of Calorimeter Systems) and 11 (Contributions of Calorimetry to the Advancement of Science) have also been brought up to date. Chapters eight, nine and (to a large extent) 10 are brand new and, in my opinion, represent the real added value of this new edition. In particular, chapter eight (New Calorimeter Techniques) discusses the two most relevant innovations introduced in the field during the past decade: dual-readout calorimetry (DRC) and particle-flow analysis (PFA).

The concept of DRC was elaborated to circumvent the limitations of compensating hadron calorimeters. Their performance depends crucially on the detection of the abundant contribution of the neutrons produced in the hadronic shower development, which in turn requires the use of heavy absorbers and a small sampling fraction – with the consequent loss of resolution for electromagnetic showers – as well as a relatively large signal-integration time and volume. In DRCs, signals coming from scintillation and Cherenkov processes provide complementary information about the shower development and allow the measurement of the electromagnetic fraction of hadron showers event by event, thus eliminating the effect of its fluctuations on calorimeter performance. This concept is discussed in depth and predictions are compared with R&D results on prototypes, providing a convincing experimental demonstration of this novel technique. Although no full-scale calorimeter of this type has been built so far, the results obtained with real detectors, combined with Monte Carlo simulations, have highlighted the breakthrough potential of this idea, which could rival the performance of the best compensating calorimeters, with much better energy resolution for electromagnetic showers. It is very stimulating food for thought for whoever is poised to design next-generation calorimeters.
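
For readers unfamiliar with the method, the essence of the dual-readout combination can be stated compactly. The sketch below is my own illustration rather than code from the book, and the h/e response ratios are purely illustrative placeholders: the scintillation and Cherenkov signals, both calibrated on electrons, are combined to recover the shower energy and its electromagnetic fraction event by event.

```python
# Minimal illustration of the dual-readout combination; the h/e values are placeholder assumptions.
def dual_readout_energy(S, C, h_over_e_S=0.7, h_over_e_C=0.2):
    """Reconstruct the hadron energy from the scintillation (S) and Cherenkov (C) signals,
    both calibrated so that S = C = E for electromagnetic showers."""
    chi = (1.0 - h_over_e_S) / (1.0 - h_over_e_C)
    return (S - chi * C) / (1.0 - chi)

def em_fraction(S, C, h_over_e_S=0.7, h_over_e_C=0.2):
    """Electromagnetic fraction f_em of the shower, recovered from the same two signals."""
    E = dual_readout_energy(S, C, h_over_e_S, h_over_e_C)
    return (S / E - h_over_e_S) / (1.0 - h_over_e_S)

# Example: with these h/e values, a 100 GeV hadron shower with f_em = 0.5 gives S = 85, C = 60.
print(dual_readout_energy(85.0, 60.0), em_fraction(85.0, 60.0))  # -> 100.0, 0.5
```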

The other important topic discussed in chapter eight, PFA, is a completely different method that is being used to improve calorimeter performance for jets. It is based on the combined use of a precision tracker and a high-granularity calorimeter, which measure the momentum of charged jet particles and the energy of neutral particles, respectively. High granularity is mandatory to avoid double counting of the charged particles already measured by the tracker. The topic is treated in great detail, with abundant examples of the application of this technique in real experiments, and its pros and cons are discussed in view of future large-scale detector systems.

As an example, the idea that one can relax the requirements on the calorimeters, since they measure on average only one third of the particles in a jet while the remaining two thirds are very well measured by the tracker, is strongly questioned because the jet-energy resolution would be dominated by the fluctuations in the fraction of the total jet energy that is carried by the charged fragments.
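
The basic bookkeeping behind particle flow is easy to caricature. The toy sketch below is my own illustration, not material from the book, and every resolution figure in it is an assumption made for the sake of the example: charged hadrons are taken from the tracker, photons from the electromagnetic calorimeter, and only the neutral hadrons from the hadron calorimeter. The fixed 60/25/15 energy split is the usual average; it is precisely the event-by-event fluctuation of that split which the passage above identifies as the limiting factor.

```python
# Toy particle-flow sum with assumed, illustrative resolutions (not any real detector's numbers).
import math
import random

def smear(value, rel_sigma):
    """Apply a Gaussian relative resolution to a true value."""
    return random.gauss(value, rel_sigma * value)

def pflow_jet_energy(particles):
    """particles: list of (true energy in GeV, kind) with kind in {'charged', 'photon', 'neutral'}."""
    total = 0.0
    for energy, kind in particles:
        if kind == "charged":
            total += smear(energy, 0.01)                      # tracker: ~1% (illustrative)
        elif kind == "photon":
            total += smear(energy, 0.15 / math.sqrt(energy))  # ECAL: ~15%/sqrt(E) (illustrative)
        else:
            total += smear(energy, 0.55 / math.sqrt(energy))  # HCAL: ~55%/sqrt(E) (illustrative)
    return total

# A 100 GeV jet with the average ~60/25/15 split between charged hadrons, photons and neutral hadrons.
jet = [(60.0, "charged"), (25.0, "photon"), (15.0, "neutral")]
print(f"reconstructed jet energy: {pflow_jet_energy(jet):.1f} GeV")
```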

Chapter nine (Analysis and Interpretation of Test Beam Data) is a brand-new addition that I find extremely illuminating and will be valuable for more than just newcomers to the field. By going through it, I have retraced the path of some of my mistakes when dealing with calorimeters, which are complex and subtly deceptive detectors, often exhibiting counterintuitive properties.

Finally, chapter 10 (Calorimeters for Measuring Natural Phenomena) is a tribute to the realisation and successful employment of calorimetric systems for the study of natural phenomena (neutrinos, cosmic rays) in Antarctica, the Mediterranean Sea and the Argentinian pampa, inside a variety of mountains and deep mines, and in space.

In summary, this second edition of Calorimetry fully meets the ambitious goals of its author: it is a well written and pleasant book, a reference manual for both beginners and experts, and a source of inspiration for future developments in the field.

The post Calorimetry: Energy Measurement in Particle Physics (2nd edition) appeared first on CERN Courier.

]]>
Sergio Bertolucci reviews in 2018 Calorimetry: Energy Measurement in Particle Physics.
The Cosmic Web https://cerncourier.com/a/the-cosmic-web/ Fri, 01 Jun 2018 15:12:51 +0000 https://preview-courier.web.cern.ch/?p=101374 Guido D’Amico reviews in 2018 The Cosmic Web.

The post The Cosmic Web appeared first on CERN Courier.

]]>
By John Richard Gott
Princeton University Press

The observation of the night sky is as old as humankind itself. Cosmology, however, has only achieved the status of “science” in the past century or so. In this book, Gott accompanies the reader through the birth of this new science and our growing understanding of the universe as a whole, starting from the observation by Hubble and others in the 1920s that distant galaxies are receding away from us. This was one of the most important discoveries in the history of science because it shifted the position of humans farther away from the centre of the cosmos and showed that the universe is not eternal, but had a beginning. The philosophical implications were hard to digest, even for Einstein, who invented the cosmological constant such that his equations of general relativity could have a static solution.

Following the first observations of distant galaxies, astronomers began to draw a comprehensive map of the observable universe. They played the same role as the explorers travelling around our planet, except that they could only sit where they were and receive light from distant objects, like faded photographs of a lost past.

After an introduction to the early days of cosmology, the book becomes more personal, and the reader feels drawn in to the excitement of actually doing research. Gott’s account of cosmology is given through the lens of his own research, making the book slightly biased towards the physics of the large-scale structure of the universe, but also more focused and definitely captivating for the reader.

The overarching theme of the book is the quest to understand the shape of the “cosmic web”, which is the distribution of galaxies and voids in a universe that is homogeneous only on very large scales. Tiny fluctuations in the matter density, ultimately quantum in origin, grow via gravity to weave the web.

In graduate school, under the supervision of Jim Gunn, Gott wrote his most cited paper, proposing a mathematical model of the gravitational collapse of small density fluctuations. Here, the readers are given a flavour of the way real research is carried out. The author describes in detail the physics involved in the topic, as well as how the article was born and completed and how it took on a life of its own to become a classic.

The author’s investigation of the large-scale structure intertwines with his passion for topology. He was fascinated by polyhedrons with an infinite number of faces, which were the subject of an award-winning project that he developed in high school and of his first scientific article published in a mathematics journal.

At the time, when astronomical surveys were covering only a small portion of the sky, it was unclear how the cosmic structures assembled. American cosmologists thought that galaxies gathered in isolated clusters floating in a low-density universe, like meatballs in a soup. On the other hand, Soviet scientists maintained that the universe was made up of a connected structure of walls and filaments, where voids appear like holes in a Swiss cheese.

Does the 3D map of the universe resemble a meatball stew or a Swiss cheese? Neither, Gott says. With his collaborators, he proposed that the cosmic web is topologically like a sponge, where voids and galaxy clusters form two interlocking regions, much like the infinite polyhedrons Gott studied in his youth.

The reader is given clear and mathematically precise descriptions of the methods used to demonstrate the idea, which was later confirmed by deeper and larger astronomical observations (in 3D), and by the analysis of the cosmic microwave background (in 2D). By that time, we had the theory of cosmological inflation to explain a few of the puzzles regarding the origin of the universe. Remarkably, inflation predicts tiny quantum fluctuations in the fabric of space–time, giving rise to a symmetry between higher and lower density perturbations, leading to the observed sponge-like topology.

Therefore, by the end of the 20th century, the pieces of our understanding of the universe were falling into place and, in 1998, the discovery that the expansion of the universe is accelerating allowed us to start thinking about the ultimate fate of the cosmos. This is the subject of the last chapter, an interesting mix of sound predictions (for the next trillion years) and speculative ideas (in a future so far away that it is hard to think about), ending the book with a question – rather than an exclamation – mark.

This is not only a good popular science book that achieves a balance between mathematical precision and a layperson’s intuition. It is also a text about the day-to-day life of a researcher, describing details of how science is actually done, the excitement of discovery and the disappointment of following a wrong path. It is a book for readers curious about cosmology, for researchers in other fields, and for young scientists, who will be inspired by an elder one to pursue the fascinating exploration of nature.

The post The Cosmic Web appeared first on CERN Courier.

]]>
Review Guido D’Amico reviews in 2018 The Cosmic Web. https://cerncourier.com/wp-content/uploads/2022/06/51LJAnR1d7L._SX331_BO1204203200_.jpg
The Standard Model in a Nutshell https://cerncourier.com/a/the-standard-model-in-a-nutshell/ Thu, 19 Apr 2018 15:30:31 +0000 https://preview-courier.web.cern.ch/?p=101398 Luis Alvarez-Gaume reviews in 2018 The Standard Model in a Nutshell.

The post The Standard Model in a Nutshell appeared first on CERN Courier.

]]>
By Dave Goldberg
Princeton University Press

This book is an excellent source for those interested in learning the basic features of the Standard Model (SM) of particle physics – also known as the Glashow–Weinberg–Salam (GWS) model – without many technical details. It is a remarkably accessible book that can be used for self-study by advanced undergraduates and beginning graduate students. All the basic building blocks are provided in a self-contained manner, so that the reader can acquire a good knowledge of quantum mechanics and electromagnetism before reaching the boundaries of the SM, which is the theory that best describes our knowledge of the fundamental interactions.

The topics that the book deals with include special relativity, basic quantum field theory and the action principle, continuous symmetries and Noether’s theorem, as well as basic group theory – in particular, the groups needed in the SM: U(1), SU(2) and SU(3). It also covers the relativistic treatment of fermions through the Dirac equation, the quantisation of the electromagnetic field and a first look at the theory of gauge transformations in a familiar context. This is followed by a reasonable account of quantum electrodynamics (QED), the most precisely tested theory to date. The quantisation rules are reviewed with clarity and a number of useful and classic computations are presented to familiarise the reader with the technical details associated with the computation of decay rates, scattering amplitudes, phase-space volumes and propagators. The book also provides an elementary description of how to construct and compute Feynman rules and diagrams, which are later applied to electron–electron scattering and electron–positron annihilation, showing how the latter relates to Compton (electron–photon) scattering. This provides the basic computational tools to be used later in the sections about electroweak and strong interactions.

At this point, before starting a description of the SM per se, the author briefly describes the historical Fermi model and then presents the main actors. The reader is introduced to the lepton doublets (comprising the electron, the muon, the tau and their neutrinos), the weak charged and neutral currents, and the vector bosons that carry the weak force (the Ws and the Z). This is followed by an analysis of electroweak unification and the introduction of the weak angle, indicating how the electromagnetic interaction sits inside the weak isospin and hypercharge. Then, the author deals with the quark doublets and the symmetry-breaking pattern, using the Brout–Englert–Higgs mechanism, which gives mass to the vector bosons and permits the accommodation of masses for the quarks and leptons. We also learn about the Cabibbo–Kobayashi–Maskawa mixing matrix, neutrino oscillations, charge–parity (CP) violation, the solar neutrino problem, and so on. To conclude, the author presents the SU(3) gauge theory of the strong interactions and provides a description of some theories that go beyond the SM, as well as a short list of important open problems. All this is covered in just over 250 pages: a remarkable achievement. In addition, the book includes many interesting and useful computations.

This work is a very welcome addition to the modern literature in particle physics and I certainly recommend it, in particular for self study. I hope, though, that in the second edition the correct Weinberg is portrayed on p184… an extremely hilarious blunder.

The post The Standard Model in a Nutshell appeared first on CERN Courier.

]]>
Review Luis Alvarez-Gaume reviews in 2018 The Standard Model in a Nutshell. https://cerncourier.com/wp-content/uploads/2018/04/CCMay18_Book-sm.jpg
A Student’s Guide to Dimensional Analysis https://cerncourier.com/a/a-students-guide-to-dimensional-analysis/ Thu, 19 Apr 2018 15:28:35 +0000 https://preview-courier.web.cern.ch/?p=101400 This short book provides an introduction to dimensional analysis, covering its history, methods and formalisation, and shows its application to a number of physics and engineering problems.

The post A Student’s Guide to Dimensional Analysis appeared first on CERN Courier.

]]>
By Don S Lemons
Cambridge University Press


Dimensional analysis is a mathematical technique that allows one to deduce the relationship between different physical quantities from the dimensions of the variables involved in the system under study. It provides a method to simplify – when possible – the resolution of complex physical problems.

This short book provides an introduction to dimensional analysis, covering its history, methods and formalisation, and shows its application to a number of physics and engineering problems. As the author explains, the foundation principle of dimensional analysis is essentially a more precise version of the well-known rule against “adding apples and oranges”; nevertheless, the successful application of this technique requires physical intuition and some experience. Most of the time it does not lead to the solution of the problem, but it can provide important hints about the direction to take, constraints on the relationship between physical variables and constants, or a confirmation of the correctness of calculations.
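
A standard worked example, not taken from the book under review, shows how far the rule can go: assuming the period of a simple pendulum depends only on its length, its mass and the local gravitational acceleration, matching dimensions fixes the answer up to a dimensionless constant.

```latex
% Standard textbook example (not from the book under review):
% ansatz T = k L^a m^b g^c with k dimensionless, then match dimensions.
[\,T\,] = \mathrm{s}, \quad [\,L\,] = \mathrm{m}, \quad [\,m\,] = \mathrm{kg}, \quad [\,g\,] = \mathrm{m\,s^{-2}}
\;\Rightarrow\;
\mathrm{s} = \mathrm{m}^{\,a+c}\,\mathrm{kg}^{\,b}\,\mathrm{s}^{-2c}
\;\Rightarrow\;
b = 0,\quad c = -\tfrac{1}{2},\quad a = \tfrac{1}{2}
\;\Rightarrow\;
T = k\,\sqrt{L/g}
% Dimensional analysis cannot fix k; the full small-oscillation solution gives k = 2\pi.
```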

After a chapter covering the basics of the method and some historical notions about it, the book offers application examples of dimensional analysis in several areas: mechanics, hydrodynamics, thermal physics, electrodynamics and quantum physics. Through the solution of these real problems, the author shows the possibilities and limitations of this technique. In the final chapter, dimensional analysis is used to take a few steps in the direction of uncovering the dimensional structure of the universe.

Aimed primarily at physics and engineering students in their first university courses, it can also be useful to experienced students and professionals. Being concise and providing problems with solutions at the end of each chapter, the book is ideal for self study.

The post A Student’s Guide to Dimensional Analysis appeared first on CERN Courier.

]]>
Review This short book provides an introduction to dimensional analysis, covering its history, methods and formalisation, and shows its application to a number of physics and engineering problems. https://cerncourier.com/wp-content/uploads/2022/06/41XEhOa9FfL._SX329_BO1204203200_.jpg
A Primer on String Theory https://cerncourier.com/a/a-primer-on-string-theory/ Thu, 19 Apr 2018 15:28:35 +0000 https://preview-courier.web.cern.ch/?p=101401 This textbook aims to provide a concise introduction to string theory for undergraduate and graduate students.

The post A Primer on String Theory appeared first on CERN Courier.

]]>
By Volker Schomerus
Cambridge University Press


This textbook aims to provide a concise introduction to string theory for undergraduate and graduate students.

String theory was first proposed in the 1960s and has become one of the main candidates for a quantum theory of gravity. While going through alternating phases of highs and lows, it has influenced numerous areas of physics and mathematics, and many theoretical developments have sprung from it.

It was the author’s intention to include in the book just the fundamental concepts and tools of string theory, rather than to be exhaustive. As Schomerus states, various textbooks are already available that cover this field in detail, from its roots to its most modern developments, but these can be overwhelming and hard to navigate for students approaching the topic for the first time.

The volume is composed of a brief historical introduction and two parts, each including various chapters. The first part is dedicated to the dynamics of strings moving in a flat Minkowski space. While these string theories do not describe nature, their study is helpful to understand many basic concepts and constructions, and to explore the relation between string theory and field theory on a two-dimensional “world”.

The second part deals with string theories for four-dimensional physics, which can be relevant to the description of our universe. In particular, the motion of superstrings on backgrounds in which some of the dimensions are curled up is studied (a phenomenon called compactification). This part, in turn, comprises three sections, each devoted to one subtopic.

First, the author discusses conformal field theory, including the SU(2) Wess–Zumino–Novikov–Witten model. He then turns to Calabi–Yau spaces and the associated string compactifications. Finally, he focuses on string dualities, with special emphasis on the AdS/CFT correspondence and its application to gauge theory.

The post A Primer on String Theory appeared first on CERN Courier.

]]>
Review This textbook aims to provide a concise introduction to string theory for undergraduate and graduate students. https://cerncourier.com/wp-content/uploads/2022/06/816eQ7mV7VL.jpg
Technology Meets Research: 60 Years of CERN Technology, Selected Highlights https://cerncourier.com/a/technology-meets-research-60-years-of-cern-technology-selected-highlights/ Thu, 19 Apr 2018 15:28:34 +0000 https://preview-courier.web.cern.ch/?p=101399 Kenneth Long reviews in 2018 Technology Meets Research: 60 Years of CERN Technology, Selected Highlights.

The post Technology Meets Research: 60 Years of CERN Technology, Selected Highlights appeared first on CERN Courier.

]]>
By Christian Fabjan, Thomas Taylor, Daniel Treille and Horst Wenninger (eds.)
World Scientific
Technology Meets Research: 60 Years of CERN Technology, Selected Highlights

This book, the 27th volume in the “Advanced Series on Directions in High Energy Physics”, presents a robust and accessible summary of 60 years of technological development at CERN. Over this period, the foundations of today’s understanding of matter, its fundamental constituents and the forces that govern its behaviour were laid and, piece by piece, the Standard Model of particle physics was established. All this was possible thanks to spectacular advances in the field of particle accelerators and detectors, which are the focus of this volume. Each of the 12 chapters is built using contributions from the physicists and engineers who played key roles in this great scientific endeavour.

After a brief historical introduction, the story starts with the Synchrocyclotron (SC), CERN’s first accelerator, which allowed – among other things – innovative experiments on pion decay and a measurement of the anomalous magnetic dipole moment of the muon. While the SC was a development of techniques employed elsewhere, the Proton Synchrotron (PS), the second accelerator constructed at CERN and now the cornerstone of the laboratory’s accelerator complex, was built using the new and “disruptive” strong-focusing technique. Fast extraction from the PS combined with the van der Meer focusing horn was key to the success of a number of experiments with bubble chambers and, in particular, to the discovery of the weak neutral current using the large heavy-liquid bubble chamber Gargamelle.

The book goes on to present the technological developments that led to the discovery of the Higgs boson by the ATLAS and CMS collaborations at the LHC, and the study of heavy-quark physics as a means to understand the dynamics of flavour and the search for phenomena not described by the SM. The taut framework that the SM provides is evident in the concise reviews of the experimental programme of LEP: the exquisitely precise measurements of the properties of the W and Z bosons, as well as of the quarks and the leptons – made by the ALEPH, DELPHI, OPAL and L3 experiments – were used to demonstrate the internal consistency of the SM and to correctly predict the mass of the Higgs boson. An intriguing insight into the breadth of expertise required to deliver this programme is given by the discussion of the construction of the LEP/LHC tunnel, where the alignment requirements were such that the geodesy needed to account for local variations in the gravitational potential and measurements were verified by observations of the stars.

The rich scientific programmes of the LHC, and of LEP before it, have their roots in the systematic development of accelerator and detector techniques. The accelerator complex at CERN has grown out of the SC.

The book concisely presents the painstaking work required to deliver the PS, the Intersecting Storage Rings (ISR) and the Super Proton Synchrotron (SPS). Experimentation at these facilities established the quark-parton model and quantum chromodynamics (QCD), demonstrated the existence of charged and neutral weak currents, and pointed out weaknesses in our understanding of the structure of the nucleon and the nucleus. The building of the SPS was expedited by the decision to use single-function magnets that enabled a staged approach to its construction. The description of the technological innovations that were required to realise the SPS includes the need for a distributed, user-friendly control-and-monitoring system. A novel solution was adopted that exploited an early implementation of a local-area network and for which a new, interpreted programming language was developed.

The book also describes the introduction of the new isotope separation online technique, which allows highly unstable nuclei to be studied, and its evolution into research on nuclear matter in extreme conditions at ISOLDE and its upgrades. The study of heavy-ion collisions, first in fixed-target experiments at the SPS and now in the ALICE experiment at the LHC, has its roots in the early nuclear-physics programme as well. The SC, and later the PS, were ideal tools to create the intense low-energy beams used to test fundamental symmetries, to search for rare decays of hadrons and leptons, and to measure the parameters of the SM.

Reading this chronicle of CERN’s outstanding record, I was struck by its extraordinary pedigree of innovation in accelerator and detector technology. Among the many examples of groundbreaking innovation discussed in the book is the construction of the ISR which, by colliding beams head on, opened the path to today’s energy frontier. The ISR programme created the conditions for pioneering developments such as the multi-wire proportional chamber and the transition-radiation detector, as well as large-acceptance magnetic spectrometers for colliding-beam experiments. Many of the technologies that underpin the success of the proton–antiproton (SppS) collider, LEP and the LHC were innovations pioneered at the ISR. For example, the discovery of the W and Z bosons at the SppS relied on the demonstration of stochastic cooling and antiproton accumulation. The development of these techniques allowed CERN to establish its antiproton programme, which encompassed the search for new phenomena at the energy frontier, as well as the study of discrete symmetries using neutral kaons at CPLEAR and the detailed study of the properties of antimatter.

The volume includes contributions on the development of the computing, data-handling and networking systems necessary to maximise the scientific output of the accelerator and detector facilities. From the digitisation and handling of bubble- and spark-chamber images in the SC era, to the distributed processing possible on the Worldwide LHC Computing Grid, the CERN community has always developed imaginative solutions to its data-processing needs.

The book concludes with thoughtful chapters that describe the impact on society of the technological innovations driven by the CERN programme, the science and art of managing large, technologically challenging and internationally collaborative projects, and a discussion of the R&D programme required to secure the next 60 years of discovery.

The contributions from leading scientists of the day collected in this relatively slim book document CERN’s 60-year voyage of innovation and discovery, the repercussions of which vindicate the vision of those who drove the foundation of the laboratory – European in constitution, but global in impact. The spirit of inclusive collaboration, which was a key element of the original vision for the laboratory, together with the aim of technical innovation and scientific excellence, are reflected in each of the articles in this unique volume.

The post Technology Meets Research: 60 Years of CERN Technology, Selected Highlights appeared first on CERN Courier.

]]>
Review Kenneth Long reviews in 2018 Technology Meets Research: 60 Years of CERN Technology, Selected Highlights. https://cerncourier.com/wp-content/uploads/2018/04/CCMay18_Book-tech.jpg
Tales of TRIUMF https://cerncourier.com/a/tales-of-triumf/ Thu, 19 Apr 2018 10:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/tales-of-triumf/ Founded 50 years ago to meet research needs that no single university could provide,
Canada’s premier accelerator laboratory continues to drive discoveries.

The post Tales of TRIUMF appeared first on CERN Courier.

]]>

The TRIUMF laboratory’s 50-year legacy is imprinted on its 13-acre campus in Vancouver; decades-old buildings of cinderblock and corrugated steel sit alongside new facilities housing state-of-the-art equipment. With each new facility, the lab continues its half-century journey from a regional tri-university meson facility (from where the acronym TRIUMF comes) to a national and international hub for science.

At the laboratory’s centre is the original 520 MeV cyclotron, a negative-hydrogen-ion accelerator so well engineered when it was first built that it continues to function (albeit with updated controls and electronics) as TRIUMF’s heart. Over the past 50 years, the TRIUMF cyclotron has spurred the growth of a diverse and multidisciplinary community whose ideas continue to coax new uses from the decades-old accelerator. These new applications serve to continuously redefine TRIUMF as an institution: a superconducting linear accelerator that complements the original cyclotron; 17 universities and counting that have joined the original trio; and an expanding network of collaborators that now spans the globe.

TRIUMF, which began with a daring idea and a simple patch of rainforest on the University of British Columbia’s (UBC) south campus, is this year reflecting on its rich past, its vibrant present and the promise of a bright future.

The tri-university meson facility

The first inklings for the tri-university meson facility were themselves a product of three separate elements: a trio of Canadian universities, a novel accelerator concept and an appetite for collaboration within the field of nuclear physics in the early 1960s. And the researchers involved were well positioned to develop such a proposal. John Warren, at that time head of the nuclear-physics group at UBC, had established a team of remarkable graduate students while constructing a 3 MeV Van de Graaff accelerator. Erich Vogt had just transitioned to the UBC physics faculty from an illustrious career as a theoretical nuclear physicist at the Chalk River Laboratory in Ontario. And finally, J Reginald Richardson, a Canadian-born physicist at the University of California, Los Angeles (UCLA), had finalised a concept for a sector-focused, spiral-ridge negative-hydrogen-ion cyclotron, many of the ideas for which came to him while holidaying at his cottage on Galiano Island on the west coast of British Columbia. In the years that followed, all three of them would go on to become a director of TRIUMF.

At the time, the world was ready to dig deeper into nuclear structure and explore other hadronic mysteries using powerful meson beams. This push for “meson factories” led to LAMPF in the US, SIN (now PSI) in Switzerland and, eventually, TRIUMF in Vancouver.

In 1964, a young physicist named Michael Craddock (who would become a long-time CERN Courier contributor) completed his PhD in nuclear physics at the University of Oxford in the UK before joining the UBC physics faculty. In June 1965, Craddock attended a meeting between representatives of UBC, the University of Victoria and Simon Fraser University, and wrote a summary of the proceedings: an agreement to develop a proposal for a tri-university meson facility based on the Richardson negative-hydrogen-ion cyclotron. Less than three years later, in April 1968, the group received $19 million CDN in federal funding and construction began.

Warren presided as TRIUMF’s first director, and many of the accelerator’s build team came from his Van de Graaff graduate students. The initial organisation consisted of a university faculty member directing the engineers and consultants responsible for each of the main components of the cyclotron: its ion source, radio-frequency, magnet and vacuum systems. Joop Burgerjon, the engineer for the construction of the 50 MeV negative-hydrogen-ion cyclotron at the University of Manitoba, which was itself a copy of the 50 MeV UCLA cyclotron, became the chief engineer for TRIUMF.

Ewart Blackmore (one of this article’s authors) was one of Warren’s graduate students who was brought back to work on the accelerator design and construction. In 1968, while working as a postdoctoral fellow at what is now the Rutherford Appleton Laboratory, in the UK, Blackmore and another postdoctoral fellow, David Axen (also a former UBC graduate student) received a call to coordinate an experiment to measure the dissociation rate of negative hydrogen ions in a magnetic field. This is an important parameter for setting the maximum magnetic field of the cyclotron. The measurement used the proton linear accelerator at the Rutherford laboratory and resulted in a higher dissociation rate than expected from earlier experiments, increasing the size of the cyclotron by 4%.
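
The link between the stripping rate and the peak field is worth spelling out (a standard argument, sketched here rather than quoted from the article): in the rest frame of an H$^-$ ion moving with speed $\beta c$ through a magnetic field $B$, the Lorentz-transformed fields include a motional electric field

$$E' \simeq \gamma \beta c B,$$

which can strip the loosely bound outer electron, whose binding energy is only about 0.75 eV. A higher measured stripping rate therefore forces a lower maximum field and, for a given top energy, a larger orbit radius, hence the 4% growth of the machine.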

Upon his return to Vancouver, Blackmore shouldered the responsibility for the cyclotron’s injection, beam diagnostics and extraction systems. All of these components and more were put to the test with a full-scale model of the cyclotron’s centre core, which achieved first beam in 1972. Finally, despite a six-month delay to reshape the magnetic field produced by the 4000 tonne magnet, the TRIUMF team of about 160 physicists, engineers and technical staff coaxed a beam of protons from the cyclotron on 15 December 1974. TRIUMF’s scientific programme began the following year with an initial complement of experimental beamlines: proton, neutron, pion and muon. In the end, the project was on budget and very near the original schedule. The machine reached its design current of 100 μA in 1977, with Blackmore coordinating the first five years of commissioning and operations. He recalls that it was a remarkable experience to witness the moment first beam was achieved from the cyclotron. “At the start of it all, most of us had little understanding of cyclotrons and related technologies, but we had the valuable experience we had gained as graduate students.”

International physics hub

The story of TRIUMF quickly developed, the lab reinventing itself time and again to keep up with the fast pace at which the field was evolving. By the early 1980s, TRIUMF was a well-established accelerator laboratory that operated the world’s largest cyclotron. In those days, TRIUMF utilised proton and neutron beams to drive a powerful research programme in nucleon–nucleon/nucleus interaction studies, muon beams for muon-spin-rotation experiments in materials science, pion beams for nuclear-structure studies, and meson beams for precision electroweak experiments.

However, as the field advanced, new discoveries in meson science were changing the landscape. TRIUMF responded by proposing an even-larger accelerator system, the 30 GeV KAON (Kaons, Antiprotons, Other hadrons and Neutrinos) complex. When fully complete, KAON would have allowed cutting-edge high-energy-physics experiments at the intensity frontier. It was a bold proposal that garnered substantial national and international interest but ultimately did not find enough political support to be funded. Nevertheless, the concept itself was considered visionary, and the science that TRIUMF wanted to enact was taken up decades later in modified forms at the J-PARC complex in Japan and the upcoming FAIR facility in Germany (CERN Courier July/August 2017 p41).

The loss of KAON forced an existential crisis on TRIUMF, and the laboratory responded in two parallel directions. Firstly, TRIUMF expanded Canada’s contributions to international physics collaborations. During the decade-long campaign to design KAON, TRIUMF had developed an impressive array of scientific and engineering talent and capabilities in the design of accelerators, production targets and detectors. This enabled the Canadian physics community – supported by TRIUMF – to contribute to CERN’s Large Hadron Collider (LHC) and join the ATLAS collaboration, building components such as the warm twin-aperture quadrupoles for the LHC and the hadronic endcap calorimeter for ATLAS. This positioned TRIUMF as Canada’s gateway to international subatomic physics and paved the way for Canada’s contributions to other major physics collaborations like T2K in Japan.

The laboratory’s second response to the loss of KAON was the development at TRIUMF of a new scientific programme centred on rare isotopes. By the 1980s, rare isotopes had become a field of burgeoning interest, opening new avenues of research for TRIUMF into nuclear astrophysics, fundamental nuclear physics and low-energy precision probes of subatomic symmetries. TRIUMF had recognised the worldwide shortage of isotope production facilities and understood the role it could play in rectifying the situation. The lab already possessed expertise in beam and target physics, design and engineering – and, since its inception, a high-powered 520 MeV cyclotron that could act as a beam driver for producing exotic isotopes.

Rare-isotope beams at TRIUMF started during the KAON era with the small TISOL (Test facility of Isotope Separation On-Line) project in 1987, which used an isotope-separation concept developed at CERN’s ISOLDE facility. Experience at TISOL gave its proponents confidence that a much-larger-scale rare-isotope-beams facility could be built at TRIUMF. And so the Isotope Separator and Accelerator (ISAC) era was born at TRIUMF. Today, TRIUMF’s ISAC boasts the highest production power of any ISOL-type facility and some of the highest rates of rare-isotope production in the world. ISAC enables TRIUMF to produce isotopes for a variety of research areas, including studies of the formation of the heavy chemical elements in the universe, exploration of phenomena beyond the Standard Model of particle physics and inquiry into the deepest secrets of the atomic nucleus. In addition, spin-polarised beta-emitting isotopes produced at TRIUMF make possible detailed probes for surface and interface studies in complex quantum materials or novel batteries, benefiting the molecular- and materials-science communities.

TRIUMF is continuing to build on its expertise and capabilities in isotope science by adding new rare-isotope production facilities to supply the laboratory’s existing experimental stations. A new project, ARIEL (the Advanced Rare Isotope Laboratory), will add two rare-isotope production stations driven by a new proton beamline from the cyclotron and a new electron beamline from a new superconducting linear accelerator (designed and built in Canada). ARIEL will triple the output of the science programme based on rare-isotope beams, creating new opportunities for innovation and allowing the lab to branch off into promising new areas, even outside of subatomic physics, materials science and nuclear astrophysics. Although ARIEL’s completion date is set for 2023, the facility’s multi-stage installation will allow the TRIUMF community to begin scientific operations as early as 2019.

An innovation lab

TRIUMF’s history is defined not only by a drive to push the frontiers of science and discovery, but also those of innovation. The flexibility of the iconic cyclotron at the heart of TRIUMF’s scientific programme has allowed the lab to venture into areas that few could have imagined at the time of its original proposal. Standing on the shoulders of its founders, TRIUMF’s community now turns to the next half-century and beyond, and asks: how can TRIUMF increase its impact on our everyday lives?

While fundamental research remains core to TRIUMF’s mission, the laboratory has long appreciated the necessity and opportunity for translating its technologies to the benefit of society. TRIUMF Innovations, the lab’s commercialisation arm, actively targets and develops new opportunities for collaboration and company creation surrounding the physics-based technologies that emerge from the TRIUMF network.

Perhaps the most long-standing of these collaborations is TRIUMF’s more than 30-year partnership with the global health-science company Nordion. A team of TRIUMF scientists, engineers and technicians works with Nordion to operate TRIUMF cyclotrons to produce commercial medical isotopes that are used in diagnosing cancer and cardiac conditions. During the course of this partnership, more than 50 million patient doses of medical isotopes have been produced at TRIUMF and delivered to patients around the world.

Another outcome of TRIUMF Innovations is ARTMS Products Inc, which produces cyclotron-target technology enabling cleaner and greener manufacturing of medical isotopes within local hospitals. ARTMS has already secured venture-capital funding and multiple successful installations are under way around the world. Its technology for producing the most commonly used medical isotope, technetium-99m, will help stabilise the global isotope supply chain in the wake of the shutdown of the Chalk River reactor facility.

TRIUMF Innovations will play a key role in fostering industry relationships enabled by the future Institute for Advanced Medical Isotopes (IAMI), a critical piece of infrastructure that will advance nuclear medicine in Canada. Supported by TRIUMF’s life sciences division, IAMI will provide infrastructure and expertise towards developing new diagnostics and radiotherapies. IAMI will also provide industry partners with facilities to study and test new isotopes and radiopharmaceuticals that hold great promise for improving the health of patients in Canada and around the world.

Similarly, TRIUMF and TRIUMF Innovations are also working to support the emerging field of targeted alpha-emitting therapeutics — radiotherapy medicines that hold new promise for patients who have been diagnosed with advanced and life-threatening metastasised cancers. Multiple new companies are currently developing novel treatments, but all are hampered by a global shortage of actinium-225 (225Ac), a hard-to-produce isotope at the core of many of these therapies. The TRIUMF cyclotron is unmatched in 225Ac production capacity, and the laboratory is working with researchers and industry partners to bring this production online and to speed up the development of new therapies with the potential to offer new hope to patients with cancers that are currently deemed incurable.

Beyond these developments, TRIUMF Innovations manages a portfolio of TRIUMF products and services that range from providing irradiation services for stress-testing communications and aerospace technologies to improving the efficacy and safety of mining exploration using muon detectors to help geologists estimate the size and location of ore deposits.

In the coming years, TRIUMF Innovations will continue to advance commercialisation both within TRIUMF and through TRIUMF’s networks. For example, TRIUMF is now seeking to develop a new data-science hub to connect its 20 member universities and global research partners to private-sector training opportunities and new quantum-computing tools. Drawing on data-science acumen developed through the ATLAS collaboration, TRIUMF is building industry partnerships that train academic researchers to use their data-science skills in the private sector and connect them with new research and career opportunities.

It is clear that TRIUMF’s sustained focus on commercialisation and collaboration will ensure that the lab continues to bring the benefits of accelerator-based science into society and to pursue world-leading science with impact.

The quest continues

Fifty years in, TRIUMF’s narrative is a continuous work in progress, a story unfolding beneath the mossy boughs of the same fir and alder trees that looked down on the first shovel strike, the first sheet of concrete, the first summer barbecue. In the coming years, the lab will continue to welcome fresh faces, to upgrade and add new facilities, to broach new frontiers, and to confront new challenges. It is difficult to predict exactly where the next era of TRIUMF will lead, but if there is one thing we can be sure of, it is that TRIUMF’s community of discoverers and innovators will be exploring ideas and seeking out new frontiers for years to come.

The post Tales of TRIUMF appeared first on CERN Courier.

]]>
Feature Founded 50 years ago to meet research needs that no single university could provide, Canada’s premier accelerator laboratory continues to drive discoveries. https://cerncourier.com/wp-content/uploads/2018/06/CC-May18-triumf-image1.jpg
Introduction to Accelerator Dynamics https://cerncourier.com/a/introduction-to-accelerator-dynamics/ Fri, 23 Mar 2018 15:47:33 +0000 https://preview-courier.web.cern.ch/?p=101415 This concise book provides an overview of accelerator physics, a field that has grown rapidly since its inception and is progressing in many directions.

The post Introduction to Accelerator Dynamics appeared first on CERN Courier.

]]>
By Stephen Peggs and Todd Satogata
Cambridge University Press

This concise book provides an overview of accelerator physics, a field that has grown rapidly since its inception and is progressing in many directions. Particle accelerators are becoming more and more sophisticated and rely on diverse technologies, depending on their application.

With a pedagogical approach, the book presents both the physics of particle acceleration, collision and beam dynamics, and the engineering aspects and technologies that lie behind the effective construction and operation of these complex machines. After a few introductory theoretical chapters, the authors delve into the different components and types of accelerators: RF cavities, magnets, linear accelerators, etc. Throughout, they also show the connections between accelerator technology and the parallel development of computational capability.

This text is aimed at university students at graduate or late undergraduate level, as well as accelerator users and operators. An introduction to the field, rather than an exhaustive treatment of accelerator physics, the book is conceived to be self-contained (to a certain extent) and to provide a strong starting point for more advanced studies on the topic. The volume is completed by a selection of exercises at the end of each chapter and an appendix with important formulae for accelerator design.

The post Introduction to Accelerator Dynamics appeared first on CERN Courier.

]]>
Review This concise book provides an overview of accelerator physics, a field that has grown rapidly since its inception and is progressing in many directions. https://cerncourier.com/wp-content/uploads/2022/06/9781107132849-feature.jpg
Data Analysis Techniques for Physical Scientists https://cerncourier.com/a/data-analysis-techniques-for-physical-scientists/ Fri, 23 Mar 2018 15:47:33 +0000 https://preview-courier.web.cern.ch/?p=101416 This textbook aims to present all of the basic statistics tools required for data analysis, not only in particle physics but also astronomy and any other area of the physical sciences.

The post Data Analysis Techniques for Physical Scientists appeared first on CERN Courier.

]]>
By Claude A Pruneau
Cambridge University Press

Also available at the CERN bookshop

Since the analysis of data from physics experiments is mainly based on statistics, all experimental physicists have to study this discipline at some point in their career. It is common, however, for students not to learn it in a specific advanced university course but in bits and pieces during their studies and subsequent career.

This textbook aims to present all of the basic statistics tools required for data analysis, not only in particle physics but also astronomy and any other area of the physical sciences. It is targeted towards graduate students and young scientists and, since it is not intended as a text for mathematicians or statisticians, detailed proofs of many of the theorems and results presented are left out.

After a philosophical introduction on the scientific method, the text is presented in three parts. In the first, the foundational concepts and methods of probability and statistics are provided, considering both the frequentist and Bayesian interpretations. The second part covers both the basic techniques and the most commonly used advanced methods for measuring particle-production cross-sections and correlation functions, and for identifying particles. Much attention is also given to the notions of statistical and systematic errors, as well as the methods used to unfold or correct data for the instrumental effects associated with measurements. Finally, in the third section, introductory techniques in Monte Carlo simulations are discussed, focusing on their application to experimental data interpretation.
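
To give a flavour of the simplest such use of toy simulations, here is a minimal sketch (my own illustration, not code from the book) in which pseudo-events are generated with a known detection efficiency and the observed yield is corrected back to the true one:

import random

def toy_experiment(n_true=100_000, efficiency=0.62, seed=42):
    """Generate n_true events, detect each with probability `efficiency`,
    then correct the observed count back to an estimate of n_true."""
    random.seed(seed)
    n_observed = sum(random.random() < efficiency for _ in range(n_true))
    n_corrected = n_observed / efficiency              # efficiency correction
    # approximate binomial uncertainty, propagated through the correction
    stat_error = (n_observed * (1 - efficiency)) ** 0.5 / efficiency
    return n_observed, n_corrected, stat_error

if __name__ == "__main__":
    obs, corr, err = toy_experiment()
    print(f"observed {obs}, corrected {corr:.0f} +/- {err:.0f} (true value 100000)")

In a real analysis the efficiency would of course come from a detailed detector simulation rather than a single number, but the logic of the correction is the same.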

The post Data Analysis Techniques for Physical Scientists appeared first on CERN Courier.

]]>
Review This textbook aims to present all of the basic statistics tools required for data analysis, not only in particle physics but also astronomy and any other area of the physical sciences. https://cerncourier.com/wp-content/uploads/2022/06/9781108416788-feature.jpg
Ripples in spacetime https://cerncourier.com/a/ripples-in-spacetime/ Fri, 23 Mar 2018 15:47:32 +0000 https://preview-courier.web.cern.ch/?p=101413 Guillermo Ballesteros reviews in 2018 Ripples in Spacetime.

The post Ripples in spacetime appeared first on CERN Courier.

]]>
By Govert Schilling
The Belknap Press of Harvard University Press

In February 2016 the LIGO and Virgo collaborations announced the first detection of gravitational waves from the collision of two black holes. It was a splendid result for a quest that started about five decades ago with the design and construction of small prototypes of laser interferometers. Since this first discovery, at least five other binary black-hole mergers have been found and gravitational waves from two colliding neutron stars have also been detected. Gravitational-wave science is now booming, literally, and will continue to do so for a long time. The upcoming observational progress in this field will impact the development of astrophysics, cosmology and, perhaps, particle physics.

Govert Schilling is an award-winning science journalist with a special interest in astronomy and space science. In this book, he guides the reader through the development of gravitational-wave astronomy, from its very origin deep in the early days of general relativity up to the first LIGO discovery. He does so, not only by delving into the key moments of this wonderful piece of history, but also by explaining the main physical and engineering ideas that made it possible.

Moreover, Schilling does a very good job discussing the scientific context in which these events and ideas arose. Far from being a mere collection of events, the book offers the reader a journey that goes beyond its title, exploring and connecting topics such as the cosmic-microwave background and its polarisation, radioastronomy and pulsars, supernovae, primordial inflation, gamma-ray bursts and even dark energy. In addition, the last few chapters of the book discuss the science that may come next, when new interferometers will join LIGO and Virgo in this adventure, observing the sky from Earth (e.g. KAGRA) and space (LISA).

The book clearly aims to target a non-specialist readership and will surely be enjoyed by people lacking a prior knowledge of astrophysics, gravitational waves or cosmology. However, this does not mean that readers more well-versed in these topics will find the book uninspiring. Schilling addresses the reader in a direct, entertaining, almost colloquial manner, managing to explain complex concepts in a few paragraphs while keeping the science sound. Besides, the book gives an interesting (and sometimes surprising) glimpse into the lives, aspirations and mutual interactions of the scientific pioneers in the field of gravitational waves.

If an objection had to be found, it would be that in the first chapter the author belittles general relativity by introducing it as “the theory behind [the movie] Interstellar”. If this scares you, read on and fear nothing. As always happens, science outshines fiction, and the rest of the book proves why this is so.

The post Ripples in spacetime appeared first on CERN Courier.

]]>
Review Guillermo Ballesteros reviews in 2018 Ripples in Spacetime. https://cerncourier.com/wp-content/uploads/2018/03/CCApr18_Book-ripples.jpg
Natural Complexity: A Modeling Handbook https://cerncourier.com/a/natural-complexity-a-modeling-handbook/ Fri, 23 Mar 2018 15:47:32 +0000 https://preview-courier.web.cern.ch/?p=101414 This book aims to introduce readers to the study of complex systems with the help of simple computational models.

The post Natural Complexity: A Modeling Handbook appeared first on CERN Courier.

]]>
By Paul Charbonneau
Princeton University Press

This book aims to introduce readers to the study of complex systems with the help of simple computational models. After showing how difficult it is to define complexity, the author explains that complex systems are an idealisation of naturally occurring phenomena in which the macroscopic structures and patterns generated are not directly controlled by processes at the macroscopic level but arise instead from dynamical interactions at the microscopic level. This kind of behaviour characterises a range of natural phenomena, from avalanches to earthquakes, solar flares, epidemics and ant colonies.

In each chapter the author introduces a simple computer-based model for one such complex phenomenon. As the author himself states, such simplified models cannot reliably predict the evolution of a real natural phenomenon; they should therefore be taken as complementary to conventional approaches for studying such systems.

Meant for undergraduate students, the book does not require previous experience in programming and each computational model is accompanied by Python code and full explanations. Nevertheless, students are expected to learn how to modify the code to tackle the problems included at the end of each chapter. Three appendices provide a review of Python programming, probability density functions and other useful mathematical tools.
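
In the same spirit as the book's computer models, the avalanches mentioned above can be caricatured with the classic Bak–Tang–Wiesenfeld sandpile, in which purely local toppling rules generate avalanches of every size. The sketch below is my own minimal Python version, not code taken from the book:

import random

def sandpile(L=20, grains=5000, threshold=4, seed=1):
    """Bak-Tang-Wiesenfeld sandpile on an L x L grid: drop `grains` grains
    one by one and record the avalanche size (number of topplings) each causes."""
    random.seed(seed)
    z = [[0] * L for _ in range(L)]
    sizes = []
    for _ in range(grains):
        i, j = random.randrange(L), random.randrange(L)
        z[i][j] += 1                              # drop one grain at a random site
        size, unstable = 0, [(i, j)]
        while unstable:                           # relax until every site is stable
            x, y = unstable.pop()
            if z[x][y] < threshold:
                continue
            z[x][y] -= threshold                  # topple: one grain to each neighbour
            size += 1
            if z[x][y] >= threshold:
                unstable.append((x, y))
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < L and 0 <= ny < L:   # grains toppled off the edge are lost
                    z[nx][ny] += 1
                    if z[nx][ny] >= threshold:
                        unstable.append((nx, ny))
        sizes.append(size)
    return sizes

if __name__ == "__main__":
    sizes = sandpile()
    print("largest avalanche:", max(sizes), "topplings out of", len(sizes), "grains dropped")

Despite its simplicity, such a model produces a broad, roughly power-law distribution of avalanche sizes, the hallmark of self-organised criticality.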

The post Natural Complexity: A Modeling Handbook appeared first on CERN Courier.

]]>
Review This book aims to introduce readers to the study of complex systems with the help of simple computational models. https://cerncourier.com/wp-content/uploads/2022/06/91TM2EXZIcL.jpg
Fashion, Faith and Fantasy in the New Physics of the Universe https://cerncourier.com/a/fashion-faith-and-fantasy-in-the-new-physics-of-the-universe/ Fri, 23 Mar 2018 15:47:31 +0000 https://preview-courier.web.cern.ch/?p=101412 Wolfgang Lerche reviews in 2018 Fashion, Faith and Fantasy in the New Physics of the Universe.

The post Fashion, Faith and Fantasy in the New Physics of the Universe appeared first on CERN Courier.

]]>
By Roger Penrose
Princeton University Press

Also available at the CERN bookshop

The well-known mathematician and theoretical physicist Roger Penrose has produced another popular book, in which he gives a critical overview of contemporary fundamental physics. The main theme is that modern theoretical physics is afflicted by an overdose of fashion, faith and fantasy, which supposedly has led recent research astray.

There are three major parts of the book to which these three f-words relate, corresponding one-to-one with some of the most popular research areas in fundamental physics. The first part, labelled “fashion”, deals with string theory. “Faith” refers to the general belief in the correctness of quantum mechanics, while “fantasy” is the verdict for certain scenarios of modern cosmology.

The book starts with an overview of particle physics as a motivation for string theory and quickly focuses on its alleged shortcomings, most notably extra dimensions. Well-known criticisms, for instance linked to the multitude of solutions (“landscape of vacua”) of string theory or the postulate of supersymmetry, follow in due course. This material is mostly routine, but there are also previously unheard-of concerns such as the notion of “too much functional freedom” or doubts about the decoupling of heavy string states (supposedly excitable, for example from the orbital kinetic energy of Earth).

Next the book turns to quantum mechanics and gives an enjoyable introduction to some of the key notions, such as superposition, spin, measurement and entanglement. The author emphasises, with great clarity, some subtle points such as how to understand the quantum mechanical superposition of space–times. In doing so, he raises some concerns and argues – quite unconventionally – that, to resolve them, it is necessary to modify quantum mechanics. In particular he asks that the postulate of linearity should be re-assessed in the presence of gravity.

The fantasy section gives an exposition of the key ideas of cosmology, in particular of all sorts of scenarios of inflation, big bang, cyclic universes and multiverses. This is all very rewarding to read, and particularly brilliant is the presentation of cosmological aspects of entropy, the second law of thermodynamics and the arrow of time. I consider this third section as the highlight of the book. The author does not hide his suspicion that many of these scenarios should not be trusted and dismisses them as crazy – while saying, as if with a twinkle in the eye: not crazy enough!

There is a brief, additional, final section that has a more personal and historical touch, and which tries to make a case for Penrose’s own pet theory: twistor theory. One cannot but feel that some of his resentment against string theory stems from a perceived under-appreciation of twistor theory. In particular, the author admits that his aversion to string theory comes almost entirely from its purported extra dimensions, whereas twistors work primarily in four dimensions.

This touches upon a weak point of the book: the author argues entirely from the direction of classical geometry, and so, like many other critics, fixates on string theory’s extra dimensions. What Penrose misses, however, is that these provide an elegant way to represent certain internal degrees of freedom (the matter fields that are needed). But this is by no means generic – on the contrary, most string backgrounds are non-geometric. For example, some are better described by a bunch of Ising models with no identifiable classical geometry at all, so the agony of how to come to grips with such “compactified” dimensions turns into a non-issue. The point is that due to quantum dualities, there is, in general, no unambiguous objective reality of string “compactification” spaces, and criticism that does not take this “stringy quantum geometry” properly into account is moot.

Somewhat similar in spirit is the criticism of quantum mechanics, which according to Penrose should be modified due to an alleged incompatibility with gravity. Today most researchers would take the opposite point of view and consider quantum mechanics as fundamental, while gravity is a derived, emergent phenomenon. This viewpoint is strongly supported by the gauge-gravity duality and its recent offspring in terms of space–time geometry arising via quantum entanglement.

All-in-all, this book excels by covering a huge range of concepts from particle physics to quantum mechanics to cosmology, presented in a beautifully clear and coherent way (spiced up with many drawings), by an independent and truly deep-thinking master of the field. It also sports a considerable number of formulae and uses mathematical concepts (like complex analysis) that a general audience would probably find difficult to deal with; there are a number of helpful appendices for non-experts, though.

Thus, Fashion, Faith and Fantasy in the New Physics of the Universe seems to be suitable for both physics students and experienced physicists alike, and I believe that either group will profit from reading it, if taken with a pinch of salt. This is because the author criticises contemporary fundamental theories through his personal view as a classical relativist, and in doing so falls short when taking certain modern viewpoints into account.

The post Fashion, Faith and Fantasy in the New Physics of the Universe appeared first on CERN Courier.

]]>
Review Wolfgang Lerche reviews in 2018 Fashion, Faith and Fantasy in the New Physics of the Universe. https://cerncourier.com/wp-content/uploads/2018/03/CCApr18_Book-fashion.jpg
Physics of Atomic Nuclei https://cerncourier.com/a/physics-of-atomic-nuclei/ Fri, 16 Feb 2018 16:01:01 +0000 https://preview-courier.web.cern.ch/?p=101428 This new textbook of nuclear physics aims to provide a review of the foundations of this branch of physics as well as to present more modern topics, including the important developments of the last 20 years.

The post Physics of Atomic Nuclei appeared first on CERN Courier.

]]>
By Vladimir Zelevinsky and Alexander Volya
Wiley

This new textbook of nuclear physics aims to provide a review of the foundations of this branch of physics as well as to present more modern topics, including the important developments of the last 20 years. Even though well-established textbooks exist in this field, the authors offer a more comprehensive text for students who want to go deeper, both in understanding the basic principles of nuclear physics and in learning about the problems that researchers are currently addressing. Indeed, renewed interest has lately revitalised the field, following the availability of new experimental facilities and increased computational resources.

Another objective of this book, which is based on the lectures and teaching experience of the authors, is to clarify, at each step, the relationship between theoretical equations and experimental observables, as well as to highlight useful methods and algorithms from computational physics.

The last few chapters cover topics not normally included in standard courses of nuclear physics, and reflect the scientific interests – and occasionally the point of view – of the authors. Many problems are also provided at the end of each chapter, and some of them are fully solved.

Compiled by Virginia Greco, CERN.

The post Physics of Atomic Nuclei appeared first on CERN Courier.

]]>
Review This new textbook of nuclear physics aims to provide a review of the foundations of this branch of physics as well as to present more modern topics, including the important developments of the last 20 years. https://cerncourier.com/wp-content/uploads/2018/02/CCboo4_02_18.jpg
String Theory Methods for Condensed Matter Physics https://cerncourier.com/a/string-theory-methods-for-condensed-matter-physics/ Fri, 16 Feb 2018 16:01:01 +0000 https://preview-courier.web.cern.ch/?p=101429 This book provides an introduction to various methods developed in string theory to tackle problems in condensed-matter physics.

The post String Theory Methods for Condensed Matter Physics appeared first on CERN Courier.

]]>
By Horatiu Nastase
Cambridge University Press

This book provides an introduction to various methods developed in string theory to tackle problems in condensed-matter physics. This is the field where string theory has been most extensively applied, thanks to the use of the correspondence between anti-de Sitter spaces (AdS) and conformal field theories (CFT). Formulated as a conjecture 20 years ago by Juan Maldacena of the Institute for Advanced Study, the AdS/CFT correspondence relates string theory, usually in its low-energy version of supergravity and in a curved background space–time, to field theory in a flat space–time of fewer dimensions. This correspondence is holographic, meaning, in a sense, that the physics of the higher-dimensional space is encoded in the lower-dimensional one without loss of information.
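
For orientation (the standard statement of the best-studied example, not a summary of the book's own presentation): $\mathcal{N}=4$ super-Yang–Mills theory with gauge group $SU(N)$ in four flat dimensions is conjectured to be equivalent to type IIB string theory on $AdS_5 \times S^5$, with the couplings related by

$$\lambda \equiv g_{\rm YM}^2 N = \frac{R^4}{\alpha'^2},$$

where $R$ is the common radius of $AdS_5$ and $S^5$ and $\sqrt{\alpha'}$ is the string length. When the gauge theory is strongly coupled ($\lambda \gg 1$), the dual geometry is weakly curved and classical supergravity becomes a good approximation, which is precisely what makes the correspondence a practical tool for strongly correlated condensed-matter problems.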

The book is organised in four parts. In the first, the author introduces modern topics in condensed-matter physics from the perspective of a string theorist. Part two gives a basic review of general relativity and string theory, in an attempt to make the book as self-contained as possible. The other two parts focus on the applications of string theory to condensed-matter problems, with the aim of providing the reader with the tools and methods available in the field. Going into more detail, part three is dedicated to methods now considered standard – such as the pp-wave correspondence, spin chains and integrability, AdS/CFT phenomenology and the fluid-gravity correspondence – while part four deals with more advanced topics that are still in development, including Fermi and non-Fermi liquids, the quantum Hall effect and non-standard statistics.

Aimed at graduate students, this book assumes a good knowledge of quantum field theory and solid-state physics, as well as familiarity with general relativity.

The post String Theory Methods for Condensed Matter Physics appeared first on CERN Courier.

]]>
Review This book provides an introduction to various methods developed in string theory to tackle problems in condensed-matter physics. https://cerncourier.com/wp-content/uploads/2018/02/CCboo3_02_18.jpg
The Standard Theory of Particle Physics: Essays to Celebrate CERN’s 60th Anniversary https://cerncourier.com/a/the-standard-theory-of-particle-physics-essays-to-celebrate-cerns-60th-anniversary/ Fri, 16 Feb 2018 16:01:00 +0000 https://preview-courier.web.cern.ch/?p=101426 Carlos Lourenço reviews in 2018 The Standard Theory of Particle Physics: Essays to Celebrate CERN’s 60th Anniversary.

The post The Standard Theory of Particle Physics: Essays to Celebrate CERN’s 60th Anniversary appeared first on CERN Courier.

]]>
By Luciano Maiani and Luigi Rolandi (eds.)
World Scientific

Also available at the CERN bookshop

This book is a collection of articles dedicated to topics within the field of Standard Model physics, authored by some of the main players in both its theory and experimental development. It is edited by Luciano Maiani and Luigi Rolandi, two well-known figures in high-energy physics.

The volume has 21 chapters, most of them devoted to very specific subjects. The first chapters take the reader through a fascinating tour of the history of the field, starting from the earliest days, around the time when CERN was established. I particularly enjoyed reading some recollections of Gerard ’t Hooft, such as: “Asymptotic freedom was discovered three times before 1973 (when Politzer, Gross and Wilczek published their results), but not recognised as a new discovery. This is just one of those cases of miscommunication. The ‘experts’ were so sure that asymptotic freedom was impossible, that signals to the contrary were not heard, let alone believed. In turn, when I did the calculation, I found it difficult to believe that the result was still not known.”

In chapter three, K Ellis reviews the evolution of our understanding of quantum chromodynamics (QCD) and deep-inelastic scattering. Among other things, he shows how the beta function depends on the strong coupling constant, αS, and explains why many perturbative calculations can be made in QCD when the interactions take place at high-enough energies. At the hadronic scale, however, αS is too large and the perturbative expansion no longer works, so alternative methods have to be used. Many non-perturbative effects can be studied with the lattice QCD approach, which is addressed in chapter five. The experimental status regarding αS is reviewed in the following chapter, where G Dissertori shows the remarkable progress in measurement precision (with LHC values reaching per-cent level uncertainties and covering an unprecedented energy range), and how the data is in excellent agreement with the theoretical expectations.
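
For readers who want the formula behind those statements (the standard one-loop result, not a quotation from the chapter): the QCD beta function reads

$$\mu^2 \frac{\mathrm{d}\alpha_S}{\mathrm{d}\mu^2} = -\,b_0\,\alpha_S^2 + \mathcal{O}(\alpha_S^3), \qquad b_0 = \frac{33 - 2n_f}{12\pi},$$

with $n_f$ the number of active quark flavours, and its solution

$$\alpha_S(Q^2) = \frac{1}{b_0 \ln\left(Q^2/\Lambda_{\rm QCD}^2\right)}$$

falls logarithmically at large momentum transfer $Q$ (asymptotic freedom, which is why perturbation theory works there) and grows as $Q$ approaches the hadronic scale $\Lambda_{\rm QCD}$ of a few hundred MeV, where the expansion breaks down and lattice or other non-perturbative methods take over.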

Through the other chapters we can find a large diversity of topics, including a review of global fits of electroweak observables, presently aimed at probing the internal consistency of the Standard Model and constraining its possible extensions given the measured masses of the Higgs boson and of the top quark. Two chapters focus specifically on the W-boson and top-quark masses. Also discussed in detail are flavour physics, rare decays, neutrino masses and oscillations, as is the production of W and Z bosons, in particular in a chapter by M Mangano.

The Higgs boson is featured in many pages: after a chapter by J Ellis, M Gaillard and D Nanopoulos covering its history (and pre-history), its experimental discovery and the measurement of its properties fill two further chapters. An impressive amount of information is condensed in these pages, which are packed with many numbers and (multi-panel) figures. Unfortunately, the figures are printed in black and white (with only two exceptions), which severely affects the clarity of many of them. A book of this importance deserved a more colourful destiny.

The editors make a good point in claiming the time has come to upgrade the Standard Model into the “Standard Theory” of particle physics, and I think this book deserves a place in the bookshelves of a broad community, from the scientists and engineers who contributed to the progress of high-energy physics to younger physicists, eager to learn and enjoy the corresponding inside stories.

The post The Standard Theory of Particle Physics: Essays to Celebrate CERN’s 60th Anniversary appeared first on CERN Courier.

]]>
Review Carlos Lourenço reviews in 2018 The Standard Theory of Particle Physics: Essays to Celebrate CERN’s 60th Anniversary. https://cerncourier.com/wp-content/uploads/2018/02/CCboo2_02_18.jpg
Relativity Matters: From Einstein’s EMC2 to Laser Particle Acceleration and Quark-Gluon Plasma https://cerncourier.com/a/relativity-matters-from-einsteins-emc2-to-laser-particle-acceleration-and-quark-gluon-plasma/ Fri, 16 Feb 2018 16:01:00 +0000 https://preview-courier.web.cern.ch/?p=101427 Torleif Ericson reviews in 2018 Relativity Matters: From Einstein’s EMC2 to Laser Particle Acceleration and Quark-Gluon Plasma,

The post Relativity Matters: From Einstein’s EMC2 to Laser Particle Acceleration and Quark-Gluon Plasma appeared first on CERN Courier.

]]>
By Johann Rafelski
Springer

Also available at the CERN bookshop

This monograph on special relativity (SR) is presented in a form accessible to a broad readership, from pre-university level to undergraduate and graduate students. At the same time, it will also be of great interest to professional physicists.

Relativity Matters has all the hallmarks of becoming a classic with further editions, and appears to have no counterpart in the literature. It is particularly useful because at present SR has become a basic part not only of particle and space physics, but also of many other branches of physics and technology, such as lasers. The book has 29 chapters organised in 11 parts, which cover topics from the basics of four-vectors, space–time, Lorentz transformations, mass, energy and momentum, to particle collisions and decay, the motion of charged particles, covariance and dynamics.

The first half of the book derives basic consequences of the SR assumptions with a minimum of mathematical tools. It concentrates on the explanation of apparently paradoxical results, presenting and refuting counterarguments as well as debunking various incorrect statements in elementary textbooks. This is done by cleverly exploiting the Galilean method of a dialogue between a professor, his assistant and a student, to bring out questions and objections.

The importance of correctly analysing the consequences for extended and accelerating bodies is clearly presented. Among the many “paradoxes”, one notes the accelerating rocket problem that the late John Bell used to tease many of the world’s most prominent physicists with. Few of them provided a perfectly satisfactory answer.

The second half of the book, starting from part VII, covers the usual textbook material and techniques at graduate level, illustrated with examples from the research frontier. The introductions to the various chapters and subsections are still enjoyable for a broader readership, requiring little mathematics. The author does not avoid technicalities such as vector and matrix algebra and symmetries, but keeps them to a minimum. However, in the parts dealing with electromagnetism, the reader is assumed to be reasonably familiar with Maxwell’s equations.

There are copious concrete exercises and solutions. Throughout the book, indeed, every chapter is complemented by a rich variety of problems that are fully worked out. These are often used to illustrate quantitatively intriguing topics, from space travel to the laser acceleration of charged particles.

An interesting afterword concluding the book discusses how very strong acceleration becomes a modern limiting frontier, beyond which SR in classical physics becomes invalid. The magnitude of the critical accelerations and critical electric and magnetic fields are qualitatively discussed. It also briefly analyses attempts by well-known physicists to side-step the problems that arise as a consequence.

Relativity Matters is excellent as an undergraduate and graduate textbook, and should be a useful reference for professional physicists and technical engineers. The many non-specialist sections will also be enjoyed by the general, science-interested public.

Torleif Ericson, CERN

The post Relativity Matters: From Einstein’s EMC2 to Laser Particle Acceleration and Quark-Gluon Plasma appeared first on CERN Courier.

]]>
Review Torleif Ericson reviews in 2018 Relativity Matters: From Einstein’s EMC2 to Laser Particle Acceleration and Quark-Gluon Plasma, https://cerncourier.com/wp-content/uploads/2018/02/CCboo1_02_18.jpg
Exact Solutions in Three-Dimensional Gravity https://cerncourier.com/a/exact-solutions-in-three-dimensional-gravity/ Mon, 15 Jan 2018 16:12:51 +0000 https://preview-courier.web.cern.ch/?p=101438 As stated by the author himself, this book is the result of many years of work and has the purpose of providing a comprehensive, but concise, account of exact solutions in three-dimensional (or 2+1) Einstein gravity.

The post Exact Solutions in Three-Dimensional Gravity appeared first on CERN Courier.

]]>
By Alberto A García-Díaz
Cambridge University Press
Exact Solutions in Three-Dimensional Gravity

As stated by the author himself, this book is the result of many years of work and aims to provide a comprehensive but concise account of exact solutions in three-dimensional (or 2+1) Einstein gravity. It presents the theoretical frameworks and the general physical and geometrical characteristics of each class of solutions, and includes information about the researchers who discovered or studied them.

These solutions are identified and ordered on the basis of their geometrical invariant properties, their symmetries and their algebraic classifications, or according to their physical nature. They are also examined from different perspectives.

Emphasis is given to solutions of the Einstein equations in the presence of matter and fields, such as point-particle solutions, perfect fluids, dilatons, inflatons and cosmological space–times.

The second part of the book discusses solutions to vacuum topologically massive gravity with a cosmological constant.

Overall, this text serves as a thorough catalogue of exact solutions in (2+1) Einstein gravity and is a very valuable resource for graduate students, as well as researchers in gravitational physics.

The post Exact Solutions in Three-Dimensional Gravity appeared first on CERN Courier.

]]>
Review As stated by the author himself, this book is the result of many years of work and has the purpose of providing a comprehensive, but concise, account of exact solutions in three-dimensional (or 2+1) Einstein gravity. https://cerncourier.com/wp-content/uploads/2022/06/31ZGLyzgFtL._SX328_BO1204203200_.jpg
Mosquitoes https://cerncourier.com/a/mosquitoes/ Mon, 15 Jan 2018 16:12:51 +0000 https://preview-courier.web.cern.ch/?p=101439 Mack Grenfell reviews in 2018 Mosquitoes.

The post Mosquitoes appeared first on CERN Courier.

]]>
by Lucy Kirkwood
National Theatre, London 18 July–28 September 2017
Mosquitoes photo

Lucy Kirkwood’s play Mosquitoes is an ambitious piece of theatre. It combines the telling of an eclectic family drama with the asking of a variety of questions ranging from personal relationships to the remit of science. Mosquitoes tells the story of CERN scientist Alice (Olivia Williams), and the fractious relationship she has with her sister Jenny (Olivia Colman). After working for 11 years at CERN on the French–Swiss border, Alice is visited by Jenny just as work on discovering the Higgs boson is nearing its peak. Conflict between Jenny and Alice’s challenged son, Luke (Joseph Quinn), drives much of the plot. Domestic scenes between these three characters are interspersed with glimpses of Luke’s absent father, who momentarily turns the theatre into a planetarium while waxing lyrical over the science which the play is set against.

The spectacle of these brief moments is a highlight of the play, contrasting wonderfully with the often mundane lives of the characters. Kirkwood also makes a poignant contrast between the characters’ personal and professional lives. Alice, despite exuding a certain confidence in her professional life as a scientist, often struggles to relate personally to those around her. Chief amongst those is her son Luke who, despite showing the occasional interest in his mother’s work, is ultimately critical of it for a number of reasons. He questions the environmental impact of what she is doing, believing that the LHC poses existential risks. He also frequently bemoans his mother’s commitment to her work, which he believes comes at his expense. As the play unfolds, it becomes apparent that Luke and his mother previously lived in the UK, that he was made to follow her to Switzerland, and that he would like to go back to England.

These personal relationships are played out in front of the sisters’ ailing mother Karen (Amanda Boxer). A former physicist herself now suffering from dementia, Karen frequently laments missing out on her chances at winning a Nobel Prize. Karen’s character, who provides the audience with a glimpse of her daughter Alice’s future, adds a sense of futility to Alice’s work.

Overall, Mosquitoes – the title coming from a line of dialogue in which protons smashing in the Large Hadron Collider are compared to mosquitoes hitting each other head on – is a stunning piece of work. Not just for the way it weaves together story lines to explore a range of complex questions, but also for the immensely high quality of acting talent which it boasts. This is bettered only by the faultless light, sound, and set design, which complement each other perfectly during the play’s most dramatic moments.

The post Mosquitoes appeared first on CERN Courier.

]]>
Review Mack Grenfell reviews in 2018 Mosquitoes. https://cerncourier.com/wp-content/uploads/2018/01/CCboo5_01_18.jpg
Fermilab at 50 https://cerncourier.com/a/fermilab-at-50/ Mon, 15 Jan 2018 16:12:50 +0000 https://preview-courier.web.cern.ch/?p=101436 The short essays received on the 50th anniversary of Fermilab have been collected in this commemorative book.

The post Fermilab at 50 appeared first on CERN Courier.

]]>
By Swapan Chattopadhyay and Joseph David Lykken (eds.)
World Scientific
Fermilab at 50

On the occasion of the 50th anniversary of its foundation, the management of Fermilab asked leading scientists and supporters, whose careers and life paths crossed at the US laboratory, to share their memories and thoughts about its past, present and future. The short essays received have been collected in this commemorative book.

Among the many prestigious contributors are Nobel laureates T D Lee, Burton Richter and Jack Steinberger, in addition to present and former Fermilab directors (Nigel Lockyer, Piermaria Oddone and John Peoples) and present and former CERN Directors-General (Fabiola Gianotti and Rolf Heuer), as well as many other important physicists, scientific leaders and even politicians and businessmen.

Through the recollections of the authors, key events in Fermilab’s history are brought to life. The milestones of 50 years of research are also retraced in a rich photo gallery.

While celebrating its glorious past, Fermilab is also looking towards its future, as highlighted in the book. Many experiments are ongoing or planned at the laboratory, and its scientific programme includes research on neutrinos, accelerator science, quantum computing, dark matter and the cosmic background radiation, as well as continued participation in LHC physics, especially in the CMS experiment.

A light read, this book will appeal to all the scientists who at some point in their career have set foot in Fermilab. It will also appeal to readers interested in discovering more about the history of the laboratory through the recollections of the people who participated in it, whether directly or indirectly.

The post Fermilab at 50 appeared first on CERN Courier.

]]>
Review The short essays received on the 50th anniversary of Fermilab have been collected in this commemorative book. https://cerncourier.com/wp-content/uploads/2018/01/CCboo2_01_18.jpg
Loop Quantum Gravity: The First 30 Years https://cerncourier.com/a/loop-quantum-gravity-the-first-30-years/ Mon, 15 Jan 2018 16:12:50 +0000 https://preview-courier.web.cern.ch/?p=101437 This book, which is part of the “100 Years of General Relativity” series of monographs, aims to provide an overview of the foundations and recent developments of loop quantum gravity.

The post Loop Quantum Gravity: The First 30 Years appeared first on CERN Courier.

]]>
By Abhay Ashtekar and Jorge Pullin (eds.)
World Scientific
Loop Quantum Gravity: The First 30 Years

This book, which is part of the “100 Years of General Relativity” series of monographs, aims to provide an overview of the foundations and recent developments of loop quantum gravity (LQG).

This is a theory that merges quantum mechanics and general relativity in an effort to unify gravity with the other three fundamental forces. In the LQG approach, space–time is not a continuum: it is quantised and treated as a dynamical entity. Unlike string theory, loop quantum gravity is a “background-independent” theory, which aims to explain space and time themselves rather than being formulated on an already existing space–time structure.
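To give a flavour of what “quantised space–time” means, one frequently quoted result of LQG (standard in the literature, quoted here for orientation rather than from this volume) is that the area of any surface has a discrete spectrum,

\[
A \;=\; 8\pi\gamma\,\ell_{\mathrm P}^{2}\sum_{i}\sqrt{j_i\,(j_i+1)},
\]

where \(\ell_{\mathrm P}\) is the Planck length, \(\gamma\) the Barbero–Immirzi parameter and the \(j_i\) are half-integer spins labelling the spin-network links that pierce the surface.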

The book comprises eight chapters, organised into three parts. The first is a general introduction that sets the scene and anticipates what will be discussed in detail in the following sections. The second part, comprising five chapters, introduces the conceptual, mathematical and physical foundations of LQG. Part three discusses the application of the theory to cosmology and black holes, and introduces predictions that might be testable in the foreseeable future.

Written by young theoretical physicists who are experts in the field, this volume is meant both to provide an introduction for newcomers and to offer senior researchers a review of the latest developments, which are not covered in many other existing books. It will also appeal to scientists who do not work directly on LQG but are interested in issues at the interface of general relativity and quantum physics.

The post Loop Quantum Gravity: The First 30 Years appeared first on CERN Courier.

]]>
Review This book, which is part of the “100 Years of General Relativity” series of monographs, aims to provide an overview of the foundations and recent developments of loop quantum gravity. https://cerncourier.com/wp-content/uploads/2018/01/CCboo3_01_18.jpg
I am the Smartest Man I Know: A Nobel Laureate’s Difficult Journey https://cerncourier.com/a/i-am-the-smartest-man-i-know-a-nobel-laureates-difficult-journey/ Mon, 15 Jan 2018 16:12:49 +0000 https://preview-courier.web.cern.ch/?p=101435 Christine Sutton reviews in 2018 I am The Smartest Man I Know.

The post I am the Smartest Man I Know: A Nobel Laureate’s Difficult Journey appeared first on CERN Courier.

]]>
By Ivar Giaever
World Scientific
I am the Smartest Man I Know: A Nobel Laureate’s Difficult Journey

At the end of his last semester studying mechanical engineering at the Norwegian Institute of Technology, Ivar Giaever gained a grade of 3.5 for a thesis on the efficiency of refrigeration machines – just a little better than the 4.0 needed to pass (on the Norwegian scale of the time, lower grades were better). The thesis had been hastily written as the machines worked badly, and he and his friend had had little time to collect their data. But they both scraped through and, as Giaever writes, “maybe sometimes life is a little bit fair after all?”.

It’s a reference to the opening words of his light-hearted autobiography: “Life is not fair, and I, for one, am happy about that.” The title sounds provocative, but the book is a reflection on how life’s little twists and turns can have extremely important consequences. Giaever calls this “luck” and admits that he has had more than his share of it – from relatively humble beginnings in Norway to a Nobel prize and beyond.

In many respects Giaever had been a “bad” student. Good at cards, billiards, chess – and drinking – he had little interest in mechanical engineering. He finished with a grade of 4.0 in both physics and mathematics, but had at least married Inger, his long-time sweetheart.

His first job was at the patent office in Oslo, but apartments were hard to find, so the couple decided to emigrate to Canada. A few twists led Giaever to General Electric (GE), where he had the chance to study again through the company’s “A, B and C” courses.

This second chance to learn proved pivotal. Seeing how the studies related to GE’s production of generators, motors and the like made learning exciting, and Giaever graduated as the best student on the A course. But GE in Canada offered only the A course and, eager to learn more, he moved to GE’s Research Laboratory in Schenectady in the US.

There he completed the B and C courses, and also began studying for a master’s degree in physics at the Rensselaer Polytechnic Institute (RPI). He was to remain with GE for the next 30 years, after being offered a permanent job, even though he did not yet have a PhD.

As a fully-fledged member of the research lab, Giaever needed a project. John Fisher proposed that he look into quantum mechanical tunnelling between thin films, which Giaever went on to do with great success in 1959.

Then, during his studies at RPI, he learned about the new Bardeen–Cooper–Schrieffer (BCS) theory of superconductivity, which predicted the appearance of a forbidden energy gap near the Fermi level when a metal becomes superconducting. Giaever realised that he could measure this gap using his tunnelling apparatus, and so provide crucial verification of the BCS theory. He also realised that tunnelling between two superconductors with different energy gaps would produce a negative resistance, and could allow for active devices such as amplifiers. He worried that if he talked about his work, others would realise this before he had done the relevant experiment.

To his surprise nobody did, hence his comment to his family: “I am the smartest man I know!”. His children thought he was being big-headed, but in 1973 the whole family went with him to Stockholm when he was awarded a share of the Nobel Prize in Physics for his work on tunnelling in superconductors.
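The physics behind the prize-winning measurement can be sketched with a standard textbook relation (given here for orientation, not taken from the book): in a superconductor the BCS density of states is gapped, and at low temperature the tunnelling conductance traces it directly,

\[
N_{s}(E) \;=\; N(0)\,\frac{|E|}{\sqrt{E^{2}-\Delta^{2}}}\quad (|E|>\Delta),
\qquad
\left.\frac{\mathrm dI}{\mathrm dV}\right|_{T\rightarrow 0}\;\propto\; N_{s}(eV),
\]

so a normal-metal–insulator–superconductor junction carries essentially no current until the bias reaches \(e|V| \simeq \Delta\), which is how tunnelling measurements map out the energy gap predicted by BCS theory.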

Giaever, of course, covers much more of his life story in this book. There is little technical detail, but a plethora of anecdotes that provide fascinating insight into a person who has made the most of his life.

Two impressions stand out: he is lucky to have found in Inger a partner with whom he has been able to share his long life; and he is lucky to have had a second chance to study and discover that he is smarter than many people thought.

The post I am the Smartest Man I Know: A Nobel Laureate’s Difficult Journey appeared first on CERN Courier.

]]>
Review Christine Sutton reviews in 2018 I am The Smartest Man I Know. https://cerncourier.com/wp-content/uploads/2018/01/CCboo1_01_18.jpg
The Physical World: An Inspirational Tour of Fundamental Physics https://cerncourier.com/a/the-physical-world-an-inspirational-tour-of-fundamental-physics/ Fri, 10 Nov 2017 16:28:49 +0000 https://preview-courier.web.cern.ch/?p=101449 Ranging from classical to quantum mechanics, from nuclear to particle physics and cosmology, this book aims to provide an overview of various branches of physics in both a comprehensive and concise fashion.

The post The Physical World: An Inspirational Tour of Fundamental Physics appeared first on CERN Courier.

]]>
By Nicholas Manton and Nicholas Mee
Oxford University Press

Ranging from classical to quantum mechanics, from nuclear to particle physics and cosmology, this book aims to provide an overview of various branches of physics in both a comprehensive and concise fashion. As the authors state, their objective is to offer an inspirational tour of fundamental physics that is accessible to readers with a high-school background in physics and mathematics, and to motivate them to delve deeper into the topics covered.

Key equations are presented and their solutions derived, ensuring that each step is clear. Emphasis is also placed on the use of variational principles in physics.

After introducing some basic ideas and tools in the first chapter, the book presents Newtonian dynamics and the application of Newton’s law of gravitation to the motion of bodies in the solar system. Chapter 3 deals with the electromagnetic field and Maxwell’s equations. From classical physics, the authors jump to Einstein’s revolutionary theory of special relativity and the concept of space–time. Chapters 5 and 6 are devoted to curved space, general relativity and its consequences, including the existence of black holes. The other revolutionary idea of the 20th century, quantum mechanics, is discussed in chapters 7 and 8, while chapter 9 applies this theory to the structure and properties of materials, and explains the fundamental principles of chemistry and solid-state physics. Chapter 10 covers thermodynamics, built on the concepts of temperature and entropy, and gives special attention to the analysis of black-body radiation. After an overview of nuclear physics (chapter 11), chapter 12 presents particle physics, including a short description of quantum field theory, the Standard Model with the Higgs mechanism and the recent discovery of its related boson. Chapters 13 and 14 are about astrophysics and cosmology, while the final chapter discusses some of the fundamental problems that remain open.

The post The Physical World: An Inspirational Tour of Fundamental Physics appeared first on CERN Courier.

]]>
Review Ranging from classical to quantum mechanics, from nuclear to particle physics and cosmology, this book aims to provide an overview of various branches of physics in both a comprehensive and concise fashion. https://cerncourier.com/wp-content/uploads/2017/11/CCboo3_10_17.jpg
The Cosmic Cocktail: Three Parts Dark Matter https://cerncourier.com/a/the-cosmic-cocktail-three-parts-dark-matter/ Fri, 10 Nov 2017 16:28:49 +0000 https://preview-courier.web.cern.ch/?p=101450 Ruth Durrer reviews in 2017 The Cosmic Cocktail: Three Parts Dark Matter.

The post The Cosmic Cocktail: Three Parts Dark Matter appeared first on CERN Courier.

]]>
By Katherine Freese
Princeton University Press

Also available at the CERN bookshop


This book by Katherine Freese, now out in paperback, is aimed at non-professionals interested in dark matter. The hypothesis that the matter in galaxy clusters is dominated by a non-luminous component, and hence is dark, goes back to a paper published in 1933 by the Swiss astronomer Fritz Zwicky, who also coined the term “dark matter”. But it has only been during the last 20 years or so that we have realised that the matter in the universe is dominated by dark matter and that most of it is non-baryonic, i.e. not made of the stuff that makes up all the other matter we know.

The author explains the observational evidence for dark matter and its relevance for cosmology and particle physics, both in a formal scientific context and also based on her personal adventures as a researcher in this field. I especially enjoyed her detailed, well-informed discussion and evaluation of present dark-matter searches.

The book is structured in nine chapters. The first is a personal introduction, followed by a historical account of the growing evidence for dark matter. Chapter 3 discusses our present understanding of the expanding universe, explaining how much of what we know is due to the very accurate observations of the cosmic microwave background. This is followed by a chapter on Big Bang nucleosynthesis, describing how the first elements beyond hydrogen (deuterium, helium-3, lithium and especially helium-4) were formed in the early universe. In the fifth chapter, the plethora of dark-matter candidates – ranging from axions to WIMPs and primordial black holes – is presented. Chapter 6 is devoted to the LHC at CERN: its four experiments are briefly described and the discovery of the Higgs is recounted. Chapters 6 and 7 are at the heart of the author’s own research (the author is a dark-matter theorist and not heavily involved in any particular dark-matter experiments). They discuss the experiments that can be undertaken to detect dark matter, either directly, indirectly or via accelerator experiments. An insightful and impartial discussion of present experiments with tentative positive detections is presented in chapter 8. The final chapter is devoted to dark energy, responsible for the accelerated expansion of the universe. Is it a cosmological constant or vacuum energy with a value that is many orders of magnitude smaller than what we would expect from quantum field theory? Is it a dynamical field or does the beautiful theory of general relativity break down at very large distances?

Even though in some places inaccuracies have slipped in, most explanations are rigorous yet non-technical. In addition to the fascinating subject, the book contains a lot of interesting personal and historical remarks (many of them from the first- or second-hand experience of the author), which are presented in an enthusiastic and funny style. They are one of the characteristics that make this book not only an interesting source of information but also a very enjoyable read.

As a female scientist myself, I appreciated the way the author acknowledges the work of women in science. She presents a picture of a field of research that has been shaped by many brilliant female scientists, starting from Vera Rubin’s investigations of galaxy rotation curves and ending with Elena Aprile’s and Laura Baudis’ lead in the most advanced direct dark-matter searches. It seems to need a woman to do justice to our outstanding female colleagues.

The fact that less than three years after the first publication of the book some cosmological parameters have shifted and some information about recent experiments is already outdated only tells us that dark matter is a hot topic of very active research. I sincerely hope that the author’s gut feeling is correct and the discovery of dark matter is just around the corner.

The post The Cosmic Cocktail: Three Parts Dark Matter appeared first on CERN Courier.

]]>
Review Ruth Durrer reviews in 2017 The Cosmic Cocktail: Three Parts Dark Matter. https://cerncourier.com/wp-content/uploads/2017/11/CCboo2_10_17.jpg
The Photomultiplier Handbook https://cerncourier.com/a/the-photomultiplier-handbook/ Fri, 10 Nov 2017 16:28:49 +0000 https://preview-courier.web.cern.ch/?p=101451 This volume is a comprehensive handbook aimed primarily at those who use, design or build vacuum photomultipliers.

The post The Photomultiplier Handbook appeared first on CERN Courier.

]]>
By A G Wright
Oxford University Press

This volume is a comprehensive handbook aimed primarily at those who use, design or build vacuum photomultipliers. Drawing on his 40 years of experience as a user and manufacturer, the author wrote it to fill perceived gaps in the existing literature.

Photomultiplier tubes (PMTs) are extremely sensitive light detectors, which multiply the current produced by incident photons by up to 100 million times. Since their invention in the 1930s they have seen huge developments that have increased their performance significantly. PMTs have been and still are extensively applied in physics experiments and their evolution has been shaped by the requirements of the scientific community.
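The quoted multiplication factor follows from a simple back-of-the-envelope estimate (a generic illustration with example numbers chosen for orientation, not figures taken from the handbook): if each of the \(n\) dynode stages emits on average \(\delta\) secondary electrons per incident electron, the overall gain is

\[
G \;=\; \delta^{\,n}, \qquad \text{e.g. } \delta = 5,\ n = 12 \;\Rightarrow\; G = 5^{12} \approx 2.4\times10^{8},
\]

of the order of the hundred-million-fold multiplication mentioned above; real tubes typically fall somewhat short of this ideal figure because the secondary-emission yield varies from stage to stage and electron collection is not perfect.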

The first group of chapters sets the scene, introducing light-detection techniques and discussing in detail photocathodes – important components of PMTs – and optical interfaces. Since light generation and detection are statistical processes, detectors providing electron multiplication are also considered statistical in their operation. As a consequence, a chapter is dedicated to some theory of statistical processes, which is important to choose, use or design PMTs. The second part of the book deals with all of the important parameters that determine the performance of a PMT, each analysed thoroughly: gain, noise, background, collection and counting efficiency, dynamic range and timing. The effects of environmental conditions on performance are also discussed. The last part is devoted to instrumentation, in particular voltage dividers and electronics for PMTs.

Each chapter concludes with a summary and a comprehensive set of references. Three appendices provide additional useful information.

The book could become a valuable reference for researchers and engineers, and for students working with light sensors and, in particular, photomultipliers.

The post The Photomultiplier Handbook appeared first on CERN Courier.

]]>
Review This volume is a comprehensive handbook aimed primarily at those who use, design or build vacuum photomultipliers. https://cerncourier.com/wp-content/uploads/2017/11/CCboo4_10_17.jpg
The Lazy Universe: An Introduction to the Principle of Least Action https://cerncourier.com/a/the-lazy-universe-an-introduction-to-the-principle-of-least-action/ Fri, 10 Nov 2017 16:28:47 +0000 https://preview-courier.web.cern.ch/?p=101448 Andrea Giammanco reviews in 2017 The Lazy Universe: An Introduction to the Principle of Least Action.

The post The Lazy Universe: An Introduction to the Principle of Least Action appeared first on CERN Courier.

]]>
By Jennifer Coopersmith
Oxford University Press

With contagious enthusiasm and a sense of humour unusual in this kind of literature, this book by Jennifer Coopersmith deals with the principle of least action or, to be more rigorous, of stationary action. As the author states, this principle defines the tendency of any physical system to seek out the “flattest” region of “space” – with appropriate definitions of the concepts of flatness and space. This is certainly not among the best-known laws of nature, despite its ubiquity in physics and having survived the advent of several scientific revolutions, including special and general relativity and quantum mechanics. The author makes a convincing case for D’Alembert’s principle (as it is often called) as a more insightful and conceptually fertile basis to understand classical mechanics than Newton’s laws. As she points out, Newton and D’Alembert asked very different questions, and in many cases variational mechanics, inspired by the latter, is more natural and insightful than working in Newton’s absolute space, but it can also feel like using a sledgehammer to crack a peanut.
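For readers meeting the idea for the first time, the principle Coopersmith discusses can be written down compactly (standard notation, not reproduced from the book): among all paths \(q(t)\) with fixed end points, the physically realised one makes the action stationary, which is equivalent to the Euler–Lagrange equation,

\[
S[q] \;=\; \int_{t_1}^{t_2} L\bigl(q,\dot q,t\bigr)\,\mathrm dt,
\qquad
\delta S = 0 \;\Longleftrightarrow\; \frac{\mathrm d}{\mathrm dt}\,\frac{\partial L}{\partial \dot q} \;-\; \frac{\partial L}{\partial q} \;=\; 0 .
\]

For the familiar choice \(L = \tfrac12 m\dot q^{2} - V(q)\) this reduces to Newton’s equation \(m\ddot q = -\,\mathrm dV/\mathrm dq\), illustrating how the variational and Newtonian formulations describe the same physics from different starting points.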

The book starts with a general and very accessible introduction to the principle of least action. Then follows a long and interesting description of the developments that led to the principle as we know it today. The second half of the book delves into Lagrangian and Hamiltonian mechanics, while the final chapter illustrates the relevance of the principle for modern (non-classical) physics, although this theme is also touched upon several times in the preceding chapters.

An important caveat is that this is not a textbook: it should be seen as complementary to, rather than a replacement for, a standard introduction to the topic. For example, the Euler–Lagrange equation is presented but not derived and, in general, mathematical formulae are kept to a bare minimum in the main text. Coopersmith compensates for this with several thorough appendices, which range from classical textbook-like examples to original derivations. She makes a convincing critique of a famous argument by Landau and Lifshitz to demonstrate the dependence of kinetic energy on the square of the speed, and in one of the appendices she develops an interesting alternative explanation.

Although the author pays a lot of credit to The Variational Principles of Mechanics by Cornelius Lanczos (written in 1949 and re-edited in 1970), hers is a very different kind of book aimed at a different public. Moreover, the author has developed several original and insightful analogies. For example, she remarks upon how smartphones know their orientation: instead of measuring positions and angles with respect to external (absolute) space, three accelerometers in the phone measure tiny motions in three directions of the local gravity field. This is reminiscent of the methods of variational mechanics.

Notations are coherent throughout the book and clearly explained, and footnotes are used wisely. With an unusual convention that is never made explicit, the author graphically warns the reader when a footnote is witty or humorous, or potentially perceived as far-fetched, by putting the text in parentheses.

My main criticism concerns the frequent references to distant chapters, which entangle the logical flow. This is a book made for re-reading and, as a result, it might be difficult to follow for readers with little previous knowledge of the topic. Moreover, I was rather baffled by the author’s confession (repeated twice) that she was unable to find a quote by Feynman that she is sure to have read in his Lectures. Nevertheless, these minor flaws do not diminish my general appreciation for Coopersmith’s very useful and well-written book.

The first part is excellent reading for anybody with an interest in the history and philosophy of science. I also recommend the book to students in physics and mathematics who are willing to dig deeper into this subject after taking classes in analytical mechanics, and I believe that it is accessible to any student in STEM disciplines. Practitioners in physics from any sub-discipline will enjoy a refresh and a different point of view that puts their tools of the trade in a broader context.

The post The Lazy Universe: An Introduction to the Principle of Least Action appeared first on CERN Courier.

]]>
Review Andrea Giammanco reviews in 2017 The Lazy Universe: An Introduction to the Principle of Least Action. https://cerncourier.com/wp-content/uploads/2017/11/CCboo1_10_17.jpg
Reaching out from the European school https://cerncourier.com/a/reaching-out-from-the-european-school/ Fri, 10 Nov 2017 09:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/reaching-out-from-the-european-school/ The CERN–JINR European School of High-Energy Physics marks 25 years of teaching advanced topics in particle physics.

The post Reaching out from the European school appeared first on CERN Courier.

]]>

Training and education have been among CERN’s core activities since the laboratory was founded. The CERN Convention of 1954 stated that these activities might include “promotion of contacts between, and interchange of, scientists…and the provision of advanced training for research workers”. It was in this spirit that the first residential schools of physics were organised by CERN in the early 1960s. Initially held in Switzerland, with a duration of one week, the schools soon evolved into two-week events that took place annually and rotated among CERN Member States.

Following discussions between the Directors-General of CERN and the Joint Institute for Nuclear Research (JINR) in Russia, it was agreed that CERN should organise the 1970 school in collaboration with JINR. The event was held in Finland, which at that time was not a Member State of either institution, and the CERN–JINR collaboration evolved into today’s annual CERN–JINR European Schools of High-Energy Physics (HEP). The European schools that began in 1993 (CERN Courier June 2013 p27) are held in a CERN Member State three years out of four, and in a JINR Member State one year out of four.

The target audience of the European schools is advanced PhD students in experimental HEP, preparing them for a career as research physicists. Around 100 students attend each event following a rigorous selection process. Those attending the 2017 school – the 25th in the series, held from 6 to 19 September in Évora, Portugal – were selected from more than 230 candidates, taking into account their potential to pursue a research career in experimental particle physics. The 100 successful students included 33 different nationalities and, reflecting an increasing trend over the past quarter century of the European schools, about a third were women.

The core programme of the schools continues to be particle-physics theory and phenomenology, including general topics such as the Standard Model, quantum chromodynamics and flavour physics, complemented by more specialised aspects such as heavy-ion physics, Higgs physics, neutrino physics and physics beyond the Standard Model. A course on practical statistics reflects the importance of this topic in modern HEP data analysis. The school also includes classes on cosmology, in light of the strong link between particle physics and astrophysical dark-matter research. Students are taught about the latest developments and prospects at CERN’s Large Hadron Collider (LHC). They also hear from the Director-General of CERN and the director of JINR about the programmes and plans of the two organisations, which have links going back more than half a century. Thus, in addition to studying a wide spectrum of physics topics, the students are given a broad overview and outlook on particle-physics facilities and related issues.

The two-week residential programme includes a total of more than 30 plenary lectures of 90 minutes each, complemented by parallel discussion sessions involving six groups of about 17 students. Each group remains with the same discussion leader for the duration of the school, providing an environment where the students are comfortable to ask questions about the lectures and explore topics of interest in greater depth. The students are encouraged to discuss their own research work with each other and with the staff of the school during an after-dinner poster session. The lecturers are highly experienced experts in their fields, coming from many different countries in Europe and beyond, while the discussion leaders are highly active, but sometimes less-senior physicists.

New ingredient

A new ingredient in the school’s programme since 2014 is training in outreach for the general public. Making use of two 90 minute teaching slots, the students learn about communicating science to a general audience from two professional trainers who have a background in journalism with the BBC. The compulsory training sessions are complemented by optional one-on-one exercises that are very popular with the students. The exercises involve acting out a radio interview about a discovery of new physics at the LHC based on a fictitious scenario.

Building on what they have learnt in the science-communication training, the students from each discussion group collaborate in their “free time” to prepare an eight-minute talk on a particle-physics topic at a level understandable to the public. This is an exercise in teamwork as well as in outreach. The group needs to identify the specific aspects of the topic that they are going to address, develop a plan to make it interesting and relevant to a general audience, share the work of preparing the presentation between the team members, and agree who will give the talk on their behalf. The results of the collaborative group projects are presented in an after-dinner session that is video recorded. A jury made up of experienced science communicators judges the projects and gives feedback to each group. The topics addressed in the projects at the 2017 school in Portugal included the Standard Model, neutrinos, extra dimensions, and cosmology, with the prize for the best team effort going to a presentation on the Higgs boson illustrated with a “cookie-eating grandmother” field.

Equipping young researchers with good science-communication skills is considered important by the management of both CERN and JINR, and outreach training is greatly appreciated by most of the European school’s students. As a follow up, students are encouraged to make contact with the people responsible for outreach in their experimental collaborations or home institutes, with a view to participating in science-communication activities.

In addition to the outreach training, important public events are often held in the host country at the time of the school – benefitting from the presence of the leading scientists who are lecturing.  This is well illustrated by the 2017 edition, at which a public event at Évora University coincided with visits to the school by CERN Director-General Fabiola Gianotti, who gave a talk entitled “The Higgs particle and our life”, and JINR director Victor Matveev. The event was attended by numerous high-level representatives of Portuguese scientific institutes and universities, and also by the Portuguese minister of science, technology and higher education, Manuel Heitor. There was an audience of about 300, including high-school teachers, pupils and university students, with more following a live webcast.

Branching out

In addition to the annual schools that take place in Europe, CERN is involved in organising schools of HEP in Latin America (in odd-numbered years since 2001) and in the Asia-Pacific region (in even-numbered years since 2012). These schools have a similar core programme to the European ones, but with more emphasis on instrumentation and experimental techniques. This reflects the fact that there are fewer opportunities in some of the countries concerned for advanced training in these areas.

Although there is so far no specific teaching at the schools in Latin America and the Asia-Pacific region on communicating science to a general audience, education and outreach activities are often arranged in the host country around the time of the schools. For example, an important education and outreach programme was organised to coincide with the 2017 CERN–Latin-American School held from 8 to 21 March in Querétaro, Mexico. Here, several teachers from the CERN school gave short lecture courses or seminars to undergraduate students from Universidad Autónoma de Querétaro and the Juriquilla campus of Universidad Nacional Autónoma de México.

A highlight of the outreach programme in Mexico was a large public event on 8 March, the arrivals day for students at the CERN school and, by coincidence, International Women’s Day. This included introductory talks by Fabiola Gianotti (recorded in advance and subtitled in Spanish) and by Julia Tagüeña Parga (in person), deputy director for scientific development in the Mexican national science and technology agency, CONACyT. These were followed by a lecture entitled “Einstein, black holes and gravitational waves” by Gabriela Gonzalez, spokesperson of the LIGO collaboration, attracting a capacity audience of about 400 people.

As is evident, the European schools of HEP have a long history and continue their primary mission of teaching HEP and related topics to young researchers. However, the programme continues to evolve, and it now includes some training in science communication that is becoming increasingly important in the CERN and JINR Member States. The success of the schools can be judged by an anonymous evaluation questionnaire in which the overall assessment is overwhelmingly positive, with about 60% of students in 2014–2017 giving the highest ranking of “excellent”.

In total, more than 3000 students have attended the schools, including the Latin-American schools since 2001 and the Asia–Europe–Pacific schools since 2012, as well as the European schools since 1993. All these schools are important ingredients in delivering CERN’s mission in education and outreach, and in supporting its policies of international co-operation and being open to geographical enlargement within and beyond Europe. They bring together participants and teachers of many different nationalities, and each school requires close collaboration between CERN, co-organisers such as JINR for the European schools, and colleagues from the host country. The schools may also link in with other aspects of CERN’s international relations. For example, the 2015 Latin-American school in Ecuador helped to pave the way for formal membership of Ecuadorian universities in the CMS experiment. Similarly, the 2011 European school and associated outreach activities in Bucharest marked steps towards Romania becoming a Member State of CERN.

The next European school will be held in Maratea, Italy, from 20 June to 3 July 2018, followed by an Asia–Europe–Pacific school in Quy Nhon, Vietnam, from 12 to 25 September 2018.

The post Reaching out from the European school appeared first on CERN Courier.

]]>
Feature The CERN–JINR European School of High-Energy Physics marks 25 years of teaching advanced topics in particle physics. https://cerncourier.com/wp-content/uploads/2018/06/CCeur1_10_17.jpg
Foundations of Nuclear and Particle Physics https://cerncourier.com/a/foundations-of-nuclear-and-particle-physics/ Fri, 13 Oct 2017 17:03:39 +0000 https://preview-courier.web.cern.ch/?p=101459 This textbook aims to present the foundations of both nuclear and particle physics in a single volume in a balanced way, and to highlight the interconnections between them.

The post Foundations of Nuclear and Particle Physics appeared first on CERN Courier.

]]>
By T W Donnelly, J A Formaggio, B R Holstein, R G Milner and B Surrow
Cambridge University Press


This textbook aims to present the foundations of both nuclear and particle physics in a single volume in a balanced way, and to highlight the interconnections between them. The material is organised from a “bottom-up” point of view, moving from the fundamental particles of the Standard Model to hadrons and finally to few- and many-body nuclei built from these hadronic constituents.

The first group of chapters introduces the symmetries of the Standard Model. The structure of the proton, neutron and nuclei in terms of fundamental quarks and gluons is then presented. A lot of space is devoted to the processes used experimentally to unravel the structure of hadrons and to probe quantum chromodynamics, with particular focus on lepton scattering. Following the treatment of two-nucleon systems and few-body nuclei, which have mass numbers below five, the authors discuss the properties of many-body nuclei, and also extend the treatment of lepton scattering to include the weak interactions of leptons with nucleons and nuclei. The last group of chapters is dedicated to relativistic heavy-ion physics and nuclear and particle astrophysics. A brief perspective on physics beyond the Standard Model is also provided.

The volume includes approximately 120 exercises and is completed by two appendices collecting values of important constants, useful equations and a brief summary of quantum theory.

The post Foundations of Nuclear and Particle Physics appeared first on CERN Courier.

]]>
Review This textbook aims to present the foundations of both nuclear and particle physics in a single volume in a balanced way, and to highlight the interconnections between them. https://cerncourier.com/wp-content/uploads/2022/06/6151cZHCADS.jpg
The Grant Writer’s Handbook: How to Write a Research Proposal and Succeed https://cerncourier.com/a/the-grant-writers-handbook-how-to-write-a-research-proposal-and-succeed/ Fri, 13 Oct 2017 17:03:39 +0000 https://preview-courier.web.cern.ch/?p=101460 This book is designed as a “how to” guide to writing grant proposals for competitive peer review.

The post The Grant Writer’s Handbook: How to Write a Research Proposal and Succeed appeared first on CERN Courier.

]]>
By Gerard M Crawley and Eoin O’Sullivan
Imperial College Press


This book is designed as a “how to” guide to writing grant proposals for competitive peer review. Nowadays researchers are often required to apply to funding agencies to secure a budget for their work, but being a good researcher does not necessarily imply being able to write a successful grant proposal. Typically, the additional skills and insights needed are learnt through experience.

This timely book aims to guide researchers through the whole process, from conceiving the initial research idea, defining a project and drafting a proposal, through to the review process and responding to reviewers’ comments. Drawing on their own experience as reviewers in a number of different countries, the authors provide many important tips to help researchers communicate both the quality of their research and their ability to carry it out and manage a grant. The authors illustrate their guidelines with the help of many examples of both successful and unsuccessful grant applications, and emphasise key messages with quotes from reviewers.

The book also contains valuable advice for primary investigators on how to set up their research budget, manage people and lead their project. Two appendices at the end of the volume provide website addresses and references, as well as an outline of how to organise a grant competition.

Aimed primarily at early career researchers applying for their first grant, the book will also be beneficial to more experienced scientists, to the administrators of universities and institutions that support their researchers during the submission process, and to the staff of recently established funding organisations, who may have little experience in organising peer-review competitions.

The post The Grant Writer’s Handbook: How to Write a Research Proposal and Succeed appeared first on CERN Courier.

]]>
Review This book is designed as a “how to” guide to writing grant proposals for competitive peer review. https://cerncourier.com/wp-content/uploads/2022/06/grant-writer-s-handbook-the-how-to-write-a-research-proposal-and-succeed.jpg
ITER Physics https://cerncourier.com/a/iter-physics/ Fri, 13 Oct 2017 17:03:38 +0000 https://preview-courier.web.cern.ch/?p=101457 This 235 page book is dedicated to the ITER tokamak, the deuterium–tritium fusion reactor under construction in France, which aims to investigate the feasibility of fusion power.

The post ITER Physics appeared first on CERN Courier.

]]>
By C Wendell Horton Jr and Sadruddin Benkadda
World Scientific

This 235 page book is dedicated to the ITER tokamak, the deuterium–tritium fusion reactor under construction in France, which aims to investigate the feasibility of fusion power. The book provides a concise overview of the state-of-the-art plasma physics involved in nuclear-fusion processes. Definitely not an introductory book – not even for a plasma-physics graduate student – it would be useful as a reference text for experts. Across 10 chapters, the authors describe the physics learned from previous tokamak projects around the world and the application of that experience to ITER.

After an introduction to the ITER project, the conventional magneto-hydrodynamic description of plasma physics is discussed, with strong emphasis on the geometry of the divertor (located at the bottom of the vacuum vessel to extract heat and reduce contamination of the plasma from impurities). Chapter 3 deals with the problem of alpha-particle distribution, which is a source of Alfvén and cyclotron instabilities. Edge localised mode (ELM) instabilities associated with the divertor’s magnetic separatrix are also discussed. Conditions of turbulent transport are assumed throughout, so chapter 4 provides a general review of our (mainly experimental) knowledge of the topic. Chapters 5 and 6 are specific to the ITER design because they describe the ELM instabilities in the ITER tokamak and the solutions adopted for their control. Concluding the part dedicated to the fusion reactor’s transient phase, chapters 7 and 8 describe steady-state operations and plasma-diagnostic techniques, respectively.

The tokamak’s complex magnetic field is able to confine charged particles in the fusion plasma but not neutral particles. Neutron bombardment of surfaces can be viewed as an inconvenience, making it necessary to ensure the walls are radiation hard, or an advantage, turning the surfaces into a breeding blanket to generate further tritium fuel. Radiation hardness of the tokamak walls is discussed in chapter 9, while chapter 10 explains how ITER will transmute a lithium blanket into tritium via bombardment with fusion neutrons. The IFMIF (International Fusion Materials Irradiation Facility) project, conceived for fusion-material tests and still in its final design phase, is also briefly presented. The book closes with some predictions about the expectations to be fulfilled by ITER, before proceeding to the design of DEMO – a future tokamak for electrical-energy production.
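The breeding step mentioned above rests on two well-known neutron reactions on lithium (standard nuclear data, quoted for orientation rather than from the book):

\[
n + {}^{6}\mathrm{Li} \;\rightarrow\; {}^{4}\mathrm{He} + {}^{3}\mathrm{H} \;+\; 4.8\ \mathrm{MeV},
\qquad
n + {}^{7}\mathrm{Li} \;\rightarrow\; {}^{4}\mathrm{He} + {}^{3}\mathrm{H} + n \;-\; 2.5\ \mathrm{MeV}.
\]

The first reaction is exothermic and works with slow neutrons, while the second needs fast neutrons but returns a neutron to the blanket, which helps sustain the tritium fuel cycle in a breeding blanket.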

In summary, ITER Physics is a book for expert scientists who are looking for a compact overview of the latest advances in tokamak physics. I appreciated the exhaustive set of references at the end of each chapter, since it provides a way to go deeper into concepts not exhaustively explained in the book. Plasma-fusion physics is complex, not only because it is a many-body problem but also because our knowledge in this field is limited, as the authors stress. I would have appreciated more graphic material in some parts: in order to fully understand how a fusion reactor works, one has to think in 3D, so schematics are always helpful.

The post ITER Physics appeared first on CERN Courier.

]]>
Review This 235 page book is dedicated to the ITER tokamak, the deuterium–tritium fusion reactor under construction in France, which aims to investigate the feasibility of fusion power. https://cerncourier.com/wp-content/uploads/2017/10/CCboo2_09_17.jpg
Relativistic Kinetic Theory, with Applications in Astrophysics and Cosmology https://cerncourier.com/a/relativistic-kinetic-theory-with-applications-in-astrophysics-and-cosmology/ Fri, 13 Oct 2017 17:03:38 +0000 https://preview-courier.web.cern.ch/?p=101458 This book provides an overview of relativistic kinetic theory, from its theoretical foundations to its applications, passing through the various numerical methods used when analytical solutions of complex equations cannot be obtained.

The post Relativistic Kinetic Theory, with Applications in Astrophysics and Cosmology appeared first on CERN Courier.

]]>
By Gregory V Vereshchagin and Alexey G Aksenov
Cambridge University Press


This book provides an overview of relativistic kinetic theory, from its theoretical foundations to its applications, passing through the various numerical methods used when analytical solutions of complex equations cannot be obtained.

Kinetic theory (KT) was born in the 19th century and aims to derive the properties of macroscopic matter from the properties of its constituent microscopic particles. The formulation of KT within special relativity was completed in the 1960s.

Relativistic KT has traditional applications in astrophysics and cosmology, two fields that tend to rely on observations rather than experiments. But it is now becoming more accessible to direct tests due to recent progress in ultra-intense lasers and inertial fusion, generating growing interest in KT in recent years.

The book has three parts. The first deals with the fundamental equations and methods of the theory, starting with the evolution of the basic concept of KT from nonrelativistic to special and general relativistic frameworks. The second part gives an introduction to computational physics and describes the main numerical methods used in relativistic KT. In the third part, a range of applications of relativistic KT are presented, including wave dispersion and thermalisation of relativistic plasma, kinetics of self-gravitating systems, cosmological structure formation, and neutrino emission during gravitational collapse.
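The central object throughout the first part is the one-particle distribution function \(f(x,p)\); in a curved space–time its evolution is governed by the relativistic Boltzmann equation, written here in its standard form for orientation (not quoted from the book):

\[
p^{\mu}\,\frac{\partial f}{\partial x^{\mu}}
\;-\;\Gamma^{\mu}_{\ \alpha\beta}\,p^{\alpha}p^{\beta}\,\frac{\partial f}{\partial p^{\mu}}
\;=\;\mathcal{C}[f],
\]

where the left-hand side describes free streaming along geodesics and \(\mathcal{C}[f]\) is the collision integral; setting \(\mathcal{C}[f]=0\) gives the collisionless (Vlasov) limit relevant, for example, to the self-gravitating systems treated in the applications part.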

Written by two experts in the field, the book is intended for students who are already familiar with both special and general relativity and with quantum electrodynamics.

The post Relativistic Kinetic Theory, with Applications in Astrophysics and Cosmology appeared first on CERN Courier.

]]>
Review This book provides an overview of relativistic kinetic theory, from its theoretical foundations to its applications, passing through the various numerical methods used when analytical solutions of complex equations cannot be obtained. https://cerncourier.com/wp-content/uploads/2022/06/41R8f6QUjUL.jpg
Radioactivity and Radiation: What They Are, What They Do, and How to Harness Them https://cerncourier.com/a/radioactivity-and-radiation-what-they-are-what-they-do-and-how-to-harness-them/ Fri, 13 Oct 2017 17:03:37 +0000 https://preview-courier.web.cern.ch/?p=101456 Federico Ravotti reviews in 2017 Radioactivity and Radiation: What They Are, What They Do, and How to Harness Them.

The post Radioactivity and Radiation: What They Are, What They Do, and How to Harness Them appeared first on CERN Courier.

]]>
By Claus Grupen and Mark Rodgers
Springer International Publishing

Have you ever thought that batteries capable of providing energy over very long periods could be made with radioisotopes? Did you know that the bacterium deinococcus radiodurans can survive enormous radiation doses and, thanks to its ability to chemically alter highly radioactive waste, it could be potentially employed to clean up radioactively contaminated areas? And do you believe that cockroaches have an extremely high radiation tolerance? Apparently, the latter is a myth. These are a few of the curiosities contained in this “all that you always wanted to know about radioactivity” book from Grupen and Rodgers. It gives a comprehensive overview of the world of radioactivity and radiation, from its history to its risks for humans.

The book begins by laying the groundwork with essential, but quite detailed (similar to a school textbook), information about the structure of matter, how radiation is generated, how it interacts with matter and how it can be measured. In the following chapters, the book explores the substantial benefits of radioactivity through its many applications (not only positive, but also negative and sometimes questionable) and the possible risks associated with its use. The authors deal mainly with ionising radiation; however, in view of the public debate about other kinds of radiation (such as mobile-phone and microwave signals), they include a brief chapter on non-ionising radiation. Also interesting are the final sections, provided as appendices, which summarise the main technologies of radiation detectors as well as the fundamental principles of radiation protection. In the latter, the rationale behind current international rules and regulations, put in place to avoid excessive radiation exposure for radiation workers and the general public, is clearly explained.

This extensive topic is covered using easily understood terms and only elementary mathematics is employed to describe the essentials of complex nuclear-physics phenomena. This makes for pleasant reading intended for the general public interested in radioactivity and radiation, but also for science enthusiasts and inquisitive minds. As a bonus, the book is illustrated with eye-catching cartoons, most of them drawn by one of the authors.

The book emphasises that radiation is everywhere and that almost everything around us is radioactive to some degree: there is natural radioactivity in our homes, in the food that we eat and the air that we breathe. Radiation from the natural environment does not present a hazard; however, radiation levels higher than the naturally occurring background can be harmful to both people and the environment. These artificially increased radiation levels are mainly due to the nuclear industry and have therefore risen substantially since the beginning of the civil-nuclear age in the 1950s. This approach helps readers to put things in perspective and allows them to compare the numbers and specific measurement quantities that are used in the radiation-protection arena. These quantities are the same used by the media, for instance, to address the general public when a radiation incident occurs.

Not only will this book enrich the reader’s knowledge about radioactivity and radiation, it will also provide them with tools to better understand many of the related scientific issues. Such comprehension is crucial for anyone who is willing to develop their own point of view and be active in public debates on the topic.

The post Radioactivity and Radiation: What They Are, What They Do, and How to Harness Them appeared first on CERN Courier.

]]>
Review Federico Ravotti reviews in 2017 Radioactivity and Radiation: What They Are, What They Do, and How to Harness Them. https://cerncourier.com/wp-content/uploads/2017/10/CCboo1_09_17.jpg