diff --git "a/raw_rss_feeds/https___physicsworld_com_feed_.xml" "b/raw_rss_feeds/https___physicsworld_com_feed_.xml"
--- "a/raw_rss_feeds/https___physicsworld_com_feed_.xml"
+++ "b/raw_rss_feeds/https___physicsworld_com_feed_.xml"
@@ -14,9 +14,9 @@ xmlns:rawvoice="https://blubrry.com/developer/rawvoice-rss/"
The post Can we compare Donald Trump’s health chief to Soviet science boss Trofim Lysenko? appeared first on Physics World.
+]]>
+Born in 1898, Lysenko was a Ukrainian plant breeder who in 1927 found he could make pea and grain plants develop at different rates by applying the right temperatures to their seeds. The Soviet news organ Pravda was enthusiastic, saying his discovery could make crops grow in winter, turn barren fields green, feed starving cattle and end famine.
+Despite having trained as a horticulturist, Lysenko rejected the then-emerging science of genetics in favour of Lamarckism, according to which organisms can pass traits acquired during their lifetimes on to their offspring. This meshed well with the Soviet philosophy of “dialectical materialism”, which sees both the natural and human worlds as evolving not through internal mechanisms but through their environment.
+Stalin took note of Lysenko’s activities and had him installed as head of key Soviet science agencies. Once in power, Lysenko dismissed scientists who opposed his views, cancelled their meetings, funded studies of discredited theories, and stocked committees with loyalists. Although Lysenko had lost his influence by the time Stalin died in 1953 – with even Pravda having turned against him – Soviet agricultural science had been destroyed.
+Lysenko’s views and actions have a resonance today when considering the activities of Robert F Kennedy Jr, who was appointed by Donald Trump as secretary of the US Department of Health and Human Services in February 2025. Of course, Trump has repeatedly sought to impose his own agenda on US science, with his destructive impact outlined in a detailed report published by the Union of Concerned Scientists in July 2025.
+Last May Trump signed executive order 14303, “Restoring Gold Standard Science”, which blasts scientists for not acting “in the best interests of the public”. He has withdrawn the US from the World Health Organization (WHO), ordered that federally sponsored research fund his own priorities, redefined the hazards of global warming, and cancelled the US National Climate Assessment (NCA), which had been running since 2000.
+But after Trump appointed Kennedy, the assault on science continued into US medicine, health and human services. In what might be called a philosophy of “political materialism”, Kennedy fired all 17 members of the Advisory Committee on Immunization Practices of the US Centers for Disease Control and Prevention (CDC), cancelled nearly $500m in mRNA vaccine contracts, hired a vaccine sceptic to study a link between vaccines and autism despite numerous studies showing no connection, and ordered the CDC to revise its website to reflect his own views on the cause of autism.
+In his 2021 book The Real Anthony Fauci: Bill Gates, Big Pharma, and the Global War on Democracy and Public Health, Kennedy promotes not germ theory but what he calls “miasma theory”, according to which diseases are prevented by nutrition and lifestyle.
+Of course, there are fundamental differences between the 1930s Soviet Union and the 2020s United States. Stalin murdered and imprisoned his opponents, while the US administration only defunds and fires them. Stalin and Lysenko were not voted in, while Trump came democratically to power, with elected representatives confirming Kennedy. Kennedy has also apologized for his most inflammatory remarks, though Stalin and Lysenko never did (nor does Trump for that matter).
+What’s more, Stalin’s and Lysenko’s actions were more grounded in apparent scientific realities and social vision than Trump’s or Kennedy’s. Stalin substantially built up much of the Soviet science and technology infrastructure, whose dramatic successes include launching the first Earth satellite Sputnik in 1957. Though it strains credulity to praise Stalin, his vision to expand Soviet agricultural production during a famine was at least plausible and its intention could be portrayed as humanitarian. Lysenko was a scientist, Kennedy is not.
+As for Lysenko, his findings seemed to carry on those of his scientific predecessors. Experimentally, he expanded the work of Russian botanist Ivan Michurin, who bred new kinds of plants able to grow in different regions. Theoretically, his work connected not only with dialectical materialism but also with that of the French naturalist Jean-Baptiste Lamarck, who claimed that acquired traits can be inherited.
+Trump and Kennedy are off-the-wall by comparison. Trump has called climate change a con job and a hoax, and seeks to stop research that says otherwise. In 2019 he falsely stated that Hurricane Dorian was predicted to hit Alabama, then ordered the National Oceanic and Atmospheric Administration to issue a statement supporting him. Trump has said he wants the US birth rate to rise and that he will be the “fertilization president”, but later fired fertility and IVF researchers at the CDC.
+As for Kennedy, he has said that COVID-19 “is targeted to attack Caucasians and Black people” and that Ashkenazi Jews and Chinese are the most immune (he disputed the remark, but it’s on video). He has also sought to retract a 2025 vaccine study from the Annals of Internal Medicine (178 1369) that directly refuted his views on autism.
+US Presidents often have pet scientific projects. Harry Truman created the National Science Foundation, Dwight D Eisenhower set up NASA, John F Kennedy started the Apollo programme, while Richard Nixon launched the Environmental Protection Agency (EPA) and the War on Cancer. But it’s one thing to support science that might promote a political agenda and another to quash science that will not.
+One ought to be able to take comfort in the fact that if you fight nature, you lose – except that the rest of us lose as well. Thanks to Lysenko’s actions, the Soviet Union lost millions of tons of grain and hundreds of herds of cattle. The promise of his work evaporated and Stalin’s dreams vanished.
+Lysenko, at least, was motivated by seeming scientific promise and social vision; the US has none. Trump has damaged the most important US scientific agencies, destroyed databases and eliminated the EPA’s research arm, while Kennedy has replaced health advisory committees with party loyalists.
+While Kennedy may not last his term – most Trump Cabinet officials don’t – the paths he has sent science policy on surely will. For Trump and Kennedy, the policy seems to consist only of supporting pet projects. Meanwhile, cases of measles in the US have reached their highest level in three decades, the seas continue to rise and the climate is changing. It is hard to imagine how enemy agents could damage US science more effectively.
+The post Can we compare Donald Trump’s health chief to Soviet science boss Trofim Lysenko? appeared first on Physics World.
+]]>The post Diagnosing brain cancer without a biopsy appeared first on Physics World.
+]]>At the heart of the approach is a black phosphorus (BP)–engineered surface plasmon resonance (SPR) interface. An ultrathin BP layer is deposited on a gold-coated fibre tip. Because of the work-function difference between BP and gold, electrons transfer from BP into the Au film, creating a strongly enhanced local electric field at the metal–semiconductor interface. This BP–Au charge-transfer nano-interface amplifies refractive-index changes at the surface far more efficiently than conventional metal-only SPR chips, enabling the detection of molecular interactions that would otherwise be too subtle to resolve and pushing the limit of detection down to 21 attomolar without nucleic-acid amplification. The BP layer also provides a high-area, biocompatible surface for immobilizing RNA reporters.
+To achieve sequence specificity, the researchers integrated CRISPR-Cas13a, an RNA-guided nuclease that becomes catalytically active only when its target sequence is perfectly matched to a designed CRISPR RNA (crRNA). When the target microRNA (miR-21) is present, activated Cas13a cleaves RNA reporters attached to the BP-modified fiber surface, releasing gold nanoparticles and reducing the local refractive index. The resulting optical shift is read out in real time through the SPR response of the BP-enhanced fiber probe, providing single-nucleotide-resolved detection directly on the plasmonic interface.
+With this combined strategy, the sensor achieved a limit of detection of 21 attomolar in buffer and successfully distinguished single-base-mismatched microRNAs. In tests on aqueous-humor samples from patients with PCNSL, the CRISPR-BP-FOSPR assay produced results that closely matched clinical qPCR data, despite operating without any amplification steps.
+Because aqueous-humor aspiration is a minimally invasive ophthalmic procedure, this BP-driven plasmonic platform may offer a practical route for early PCNSL screening, longitudinal monitoring, and potentially the diagnosis of other neurological diseases reflected in eye-fluid biomarkers. More broadly, the work showcases how black-phosphorus-based charge-transfer interfaces can be used to engineer next-generation, fibre-integrated biosensors that combine extreme sensitivity with molecular precision.
+Yanqi Ge et al 2025 Rep. Prog. Phys. 88 070502
Theoretical and computational tools to model multistable gene regulatory networks by Federico Bocci, Dongya Jia, Qing Nie, Mohit Kumar Jolly and José Onuchic (2023)
+The post Diagnosing brain cancer without a biopsy appeared first on Physics World.
+]]>The post 5f electrons and the mystery of δ-plutonium appeared first on Physics World.
+]]>The delta (δ) phase is perhaps the most interesting allotrope of plutonium. δ-plutonium is technologically important and has a very simple crystal structure, yet its electronic structure has been debated for decades. Researchers have long attempted to understand its anomalous behaviour and how the properties of δ-plutonium are connected to the 5f electrons.
+The 5f electrons are found in the actinide group of elements, which includes plutonium. Their behaviour is counterintuitive. They are sensitive to temperature, pressure and composition, and behave both in a localised manner, staying close to the nucleus, and in a delocalised (itinerant) manner, spreading out and contributing to bonding. Both states can support magnetism, depending on the actinide element. The 5f electrons contribute to δ-phase stability, to anomalies in the material’s volume and bulk modulus, and to a negative thermal expansion whereby the δ-phase shrinks when heated.
+
In this work, the researchers present a comprehensive model to predict the thermodynamic behaviour of δ-plutonium, which has a face-centred cubic structure. They use density functional theory, a computational technique that explores the overall electron density of the system and incorporate relativistic effects to capture the behaviour of fast-moving electrons and complex magnetic interactions. The model includes a parameter-free orbital polarization mechanism to account for orbital-orbital interactions, and incorporates anharmonic lattice vibrations and magnetic fluctuations, both transverse and longitudinal modes, driven by temperature-induced excitations. Importantly, it is shown that negative thermal expansion results from magnetic fluctuations.
+This is the first model to integrate electronic effects, magnetic fluctuations, and lattice vibrations into a cohesive framework that aligns with experimental observations and semi-empirical models such as CALPHAD. It also accounts for fluctuating states beyond the ground state and explains how gallium composition influences thermal expansion. Additionally, the model captures the positive thermal expansion behaviour of the high-temperature epsilon phase, offering new insight into plutonium’s complex thermodynamics.
+First principles free energy model with dynamic magnetism for δ-plutonium
+Per Söderlind et al 2025 Rep. Prog. Phys. 88 078001
Pu 5f population: the case for n = 5.0 J G Tobin and M F Beaux II (2025)
+The post 5f electrons and the mystery of δ-plutonium appeared first on Physics World.
+]]>The post Scientists explain why ‘seeding’ clouds with silver iodide is so efficient appeared first on Physics World.
+]]>“Silver iodide has been used in atmospheric weather modification programs around the world for several decades,” explains Jan Balajka from TU Wien’s Institute of Applied Physics, who led this research. “In fact, it was chosen for this purpose as far back as the 1940s because of its atomic crystal structure, which is nearly identical to that of ice – it has the same hexagonal symmetry and very similar distances between atoms in its lattice structure.”
+The basic idea, Balajka continues, originated with the 20th-century American atmospheric scientist Bernard Vonnegut, who suggested in 1947 that introducing small silver iodide (AgI) crystals into a cloud could provide nuclei for ice to grow on. But while Vonnegut’s proposal worked (and helped to inspire his brother Kurt’s novel Cat’s Cradle), this simple picture is not entirely accurate. The stumbling block is that nucleation occurs at the surface of a crystal, not inside it, and the atomic structure of an AgI surface differs significantly from its interior.
+To investigate further, Balajka and colleagues used high-resolution atomic force microscopy (AFM) and advanced computer simulations to study the atomic structure of 2‒3 nm diameter AgI crystals after they had been broken into two pieces. The team’s measurements revealed that both freshly cleaved surfaces differed from the structure found inside the crystal.
+More specifically, team member Johanna Hütner, who performed the experiments, explains that when an AgI crystal is cleaved, the silver atoms end up on one side while the iodine atoms appear on the other. This has implications for ice growth, because while the silver side maintains a hexagonal arrangement that provides an ideal template for the growth of ice layers, the iodine side reconstructs into a rectangular pattern that no longer lattice-matches the hexagonal symmetry of ice crystals. The iodine side is therefore incompatible with the epitaxial growth of hexagonal ice.
+“Our work solves this decades-long controversy of the surface vs bulk structure of AgI, and shows that structural compatibility does matter,” Balajka says.
+According to Balajka, the team’s experiments were far from easy. Many experimental methods for studying the structure and properties of material surfaces are based on interactions with charged particles such as electrons or ions, but AgI is an electrical insulator, which “excludes most of the tools available,” he explains. Using AFM enabled them to overcome this problem, he adds, because this technique detects interatomic forces between a sharp tip and the surface and does not require a conductive sample.
+Another problem is that AgI is photosensitive and decomposes when exposed to visible light. While this property is useful in other contexts – AgI was a common ingredient in early photographic plates – it created complications for the TU Wien team. “Conventional AFM setups make use of optical laser detection to map the topography of a sample,” Balajka notes.
+To avoid destroying their sample while studying it, the researchers therefore had to use a non-contact AFM based on a piezoelectric sensor that detects electrical signals and does not require optical readout. They also adapted their setup to operate in near-darkness, using only red light while manipulating the AgI to ensure that stray light did not degrade the samples.
+The computational modelling part of the work introduced yet another hurdle to overcome. “Both Ag and I are atoms with a high number of electrons in their electron shells and are thus highly polarizable,” Balajka explains. “The interaction between such atoms cannot be accurately described by standard computational modelling methods such as density functional theory (DFT), so we had to employ highly accurate random-phase approximation (RPA) calculations to obtain reliable results.”
+The researchers acknowledge that their study, which is detailed in Science Advances, was conducted under highly controlled conditions – ultrahigh vacuum, low pressure and temperature and a dark environment – that are very different from those that prevail inside real clouds. “The next logical step for us is therefore to confirm whether our findings hold under more representative conditions,” Balajka says. “We would like to find out whether the structure of AgI surfaces is the same in air and water, and if not, why.”
+The researchers would also like to better understand the atomic arrangement of the rectangular reconstruction of the iodine surface. “This would complete the picture for the use of AgI in ice nucleation, as well as our understanding of AgI as a material overall,” Balajka says.
+The post Scientists explain why ‘seeding’ clouds with silver iodide is so efficient appeared first on Physics World.
+]]>At the heart of the QSC programme sits ORNL’s leading-edge research infrastructure for classical HPC, a capability that includes Frontier, the first supercomputer to break the exascale barrier and still one of the world’s most powerful. On that foundation, QSC is committed to building QHPC architectures that take advantage of both quantum computers and exascale supercomputing to tackle all manner of scientific and industrial problems beyond the reach of today’s HPC systems alone.
“Hybrid classical-quantum computing systems are the future,” says Humble. “With quantum computers connecting both physically and logically to existing HPC systems, we can forge a scalable path to integrate quantum technologies into our scientific infrastructure.”
-

Industry partnerships are especially important in this regard. Working in collaboration with the likes of IonQ, Infleqtion and QuEra, QSC scientists are translating a range of computationally intensive scientific problems – quantum simulations of exotic matter, for example – onto the vendors’ quantum computing platforms, generating excellent results out the other side.
“With our broad representation of industry partners,” notes Humble, “we will establish a common framework by which scientific end-users, software developers and hardware architects can collaboratively advance these tightly coupled, scalable hybrid computing systems.”
It’s a co-development model that industry values greatly. “Reciprocity is key,” Humble adds. “At QSC, we get to validate that QHPC can address real-world research problems, while our industry partners gather user feedback to inform the ongoing design and optimization of their quantum hardware and software.”
@@ -191,7 +322,7 @@ xmlns:rawvoice="https://blubrry.com/developer/rawvoice-rss/"
Listen to the Physics World podcast: Oak Ridge’s Quantum Science Center takes a multidisciplinary approach to developing quantum materials and technologies


With an acknowledged shortage of skilled workers across the quantum supply chain, QSC is doing its bit to bolster the scientific and industrial workforce. Front-and-centre: the fifth annual QSC Summer School, which was held at Purdue University in April this year, hosting 130 graduate students (the largest cohort to date) through an intensive four-day training programme.
The Summer School sits as part of a long-term QSC initiative to equip ambitious individuals with the specialist domain knowledge and skills needed to thrive in a quantum sector brimming with opportunity – whether that’s in scientific research or out in industry with hardware companies, software companies or, ultimately, the end-users of quantum technologies in key verticals like pharmaceuticals, finance and healthcare.
@@ -5615,191 +5746,5 @@ To learn about calendar aging challenges in next generation Si based Li-ion batt
The post Bayes’ rule goes quantum appeared first on Physics World.
-]]>Bayes’ rule is named after Thomas Bayes, who first defined it for conditional probabilities in “An Essay Towards Solving a Problem in the Doctrine of Chances”, published in 1763. It describes the probability of an event based on prior knowledge of conditions that might be related to the event. One area in which it is routinely used is to update beliefs based on new evidence (data). In classical statistics, the rule can be derived from the principle of minimum change, meaning that the updated beliefs must be consistent with the new data while deviating only minimally from the previous belief.
-In mathematical terms, the principle of minimum change minimizes the distance between the joint probability distributions of the initial and updated belief. Simply put, this is the idea that for any new piece of information, beliefs are updated in the smallest possible way that is compatible with the new facts. For example, when a person tests positive for Covid-19, they may have suspected that they were ill, but the new information confirms it. Bayes’ rule is therefore a way to calculate the probability of having contracted Covid-19 based not only on the test result and the reliability of the test, but also on the patient’s initial suspicions.
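The Covid-19 example above can be made concrete with a short Python sketch of Bayes’ rule. The prior, sensitivity and false-positive rate below are hypothetical numbers chosen purely for illustration, not figures from the article.

```python
# Illustrative Bayes' rule calculation: probability of being ill given a
# positive test. All numbers are hypothetical, for illustration only.
def posterior(prior, sensitivity, false_positive_rate):
    """P(ill | positive test) via Bayes' rule."""
    # Total probability of testing positive (ill or not).
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

# A patient who already suspects illness (prior 0.3) tests positive with a
# test of 90% sensitivity and a 5% false-positive rate.
print(round(posterior(0.30, 0.90, 0.05), 3))  # → 0.885
```

A stronger prior suspicion turns the same positive result into a higher posterior probability, which is exactly the “updating of beliefs” the rule formalizes.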
-Quantum versions of Bayes’ rule have been around for decades, but the approach through the minimum change principle had not been tried before. In the new work, a team led by Ge Bai, Francesco Buscemi and Valerio Scarani set out to do just that.
-“We found which quantum Bayes’ rule is singled out when one maximizes the fidelity (which is equivalent to minimizing the change) between two processes,” explains Bai. “In many cases, the solution is the ‘Petz recovery map’, proposed by Dénes Petz in the 1980s and already considered one of the best candidates for the quantum Bayes’ rule. It is based on the rules of information processing, crucial not only for human reasoning, but also for machine learning models that update their parameters with new data.”
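For reference, the Petz recovery map mentioned here has a standard textbook form (this is the general definition, not an equation taken from the paper in question): for a quantum channel $\mathcal{N}$ and a prior state $\sigma$,

```latex
\mathcal{P}_{\sigma,\mathcal{N}}(\rho)
  = \sigma^{1/2}\,\mathcal{N}^{\dagger}\!\Big(\mathcal{N}(\sigma)^{-1/2}\,\rho\,\mathcal{N}(\sigma)^{-1/2}\Big)\,\sigma^{1/2},
```

where $\mathcal{N}^{\dagger}$ is the adjoint of the channel. When all the operators involved commute, this expression reduces to the classical Bayes’ rule, which is one reason it has long been regarded as a natural quantum analogue.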
-Quantum theory is counter-intuitive, and the mathematics is hard, says Bai. “Our work provides a mathematically sound way to update knowledge about a quantum system, rigorously derived from simple principles of reasoning,” he tells Physics World. “It demonstrates that the mathematical description of a quantum system – the density matrix – is not just a predictive tool, but is genuinely useful for representing our understanding of an underlying system. It effectively extends the concept of gaining knowledge, which mathematically corresponds to a change in probabilities, into the quantum realm.”
-The “simple principles of reasoning” encompass the minimum change principle, adds Buscemi. “The idea is that while new data should lead us to update our opinion or belief about something, the change should be as small as possible, given the data received.
-“It’s a conservative stance of sorts: I’m willing to change my mind, but only by the amount necessary to accept the hard facts presented to me, no more.”
-“This is the simple (yet powerful) principle that Ge mentioned,” he says, “and it guides scientific inference by preventing unwanted biases from entering the reasoning process.”
-While several quantum versions of the Bayes’ rule have been put forward before now, these were mostly based on their having analogous properties to the classical counterpart, adds Scarani. “Recently, Francesco and one co-author proposed an axiomatic approach to the most frequently used quantum Bayes rule, the one using the Petz recovery map. Our work is the first to derive a quantum Bayes rule from an optimization principle, which works very generally for classical information, but which has been used here for the first time in quantum information.”
-The result is very intriguing, he says: “We recover the Petz map in many cases, but not all. If we take it that our new approach is the correct way to define a quantum Bayes rule, then previous constructions based on analogies were correct very often, but not quite always; and one or more of the axioms are not to be enforced after all. Our work is therefore a major advance, but it is not the end of the road – and this is nice.”
-Indeed, the researchers say they are now busy further refining their quantum Bayes’ rule. They are also looking into applications for it. “Beyond machine learning, this rule could be powerful for inference—not just for predicting the future but also retrodicting the past,” says Bai. “This is directly applicable to problems in quantum communication, where one must recover encoded messages, and in quantum tomography, where the goal is to infer a system’s internal state from observations.
-“We will be using our results to develop new, hopefully more efficient, and mathematically well-founded methods for these tasks,” he concludes.
-The present study is detailed in Physical Review Letters.
-The post Bayes’ rule goes quantum appeared first on Physics World.
-]]>The post The top five physics Nobel prizes of the 21st century revealed appeared first on Physics World.
-]]>Quantum physics is our hot favourite this time round – it’s the International Year of Quantum Science and Technology and the Nobel Committee for Physics aren’t immune to wider events. But whoever wins, you know that the prize will have been very carefully considered by committee members.
-Over the 125 years since the prize was first awarded, almost every seminal finding in physics has been honoured – from the discovery of the electron, neutrino and positron to the development of quantum mechanics and the observation of high-temperature superconductivity.
-But what have been the most significant physics prizes of the 21st century? I’m including 2000 as part of this century (ignoring pedants who say it didn’t start till 1 January 2001). During that time, the Nobel Prize for Physics has been awarded 25 times and gone to 68 different people, averaging out at about 2.7 people per prize.
-Now, my choice is entirely subjective, but I reckon the most significant prizes are those that:
-So with that in mind, here’s my pick of the five top physics Nobel prizes of the 21st century. You’ll probably disagree violently with my choice so e-mail us with your thoughts.
-
-Coming in at number five in our list of top physics Nobels of the 21st century is the discovery of neutrino oscillation, which went to Takaaki Kajita and Art McDonald in 2015. The neutrino was first hypothesized by Wolfgang Pauli back in 1930 as “a desperate remedy” for the fact that energy didn’t seem to be conserved when a nucleus emits an electron via beta decay. Neutrinos themselves – chargeless particles that interact with matter via the weak force and are fiendishly hard to detect – were first observed by Fred Reines and Clyde Cowan, a discovery for which Reines won a Nobel prize in 1995.
-But what Kajita (at the Super-Kamiokande experiment in Japan) and McDonald (at the Sudbury Neutrino Observatory in Canada) had done was to see them switch, or “oscillate”, from one type to another. Their work proved that these particles, which physicists had assumed to be massless, do have mass after all. This was at odds with the Standard Model of particle physics – and isn’t it fun when physics upends conventional wisdom?
-What’s more, the discovery of neutrino oscillation explained why Ray Davis and John Bahcall had seen only a third of the solar neutrinos predicted by theory in their famous experiment, first proposed in 1964. This discrepancy arose because solar neutrinos oscillate between flavours as they travel to the Earth – and their experiment had detected only a third because it was sensitive mainly to electron neutrinos, not the other types.
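The flavour-switching at work here is captured by the standard two-flavour oscillation probability (a textbook formula, not one quoted in the article; written in natural units):

```latex
P(\nu_e \to \nu_\mu) = \sin^2(2\theta)\,\sin^2\!\left(\frac{\Delta m^2\,L}{4E}\right),
```

where $\theta$ is the mixing angle, $\Delta m^2$ the difference of the squared masses, $L$ the distance travelled and $E$ the neutrino energy. Oscillation requires $\Delta m^2 \neq 0$, which is precisely why observing it implies that neutrinos have mass.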
-
At number four in our list of the best physics Nobel prizes of the 21st century is the 2001 award, which went to Eric Cornell, Wolfgang Ketterle and Carl Wieman for creating the first Bose–Einstein condensates (BECs). I love the idea that Cornell and Wieman created a new state of matter – in which particles are locked together in their lowest quantum state – at exactly 10.54 a.m. on Monday 5 June 1995 at the JILA laboratory in Boulder, Colorado.
-First envisaged by Satyendra Nath Bose and Albert Einstein in 1924, a BEC was finally created by Cornell and Wieman, who cooled 2000 rubidium-87 atoms to 170 nK using the then new techniques of laser and evaporative cooling. Within a few months, Wolfgang Ketterle over at the Massachusetts Institute of Technology had also made a BEC, from 500,000 sodium-23 atoms at 2 μK.
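The nanokelvin temperatures quoted above can be compared with the standard textbook condensation temperature of a uniform ideal Bose gas (a general result, not a formula given in the article):

```latex
T_c = \frac{2\pi\hbar^2}{m k_B}\left(\frac{n}{\zeta(3/2)}\right)^{2/3},
```

where $n$ is the number density, $m$ the atomic mass and $\zeta(3/2)\approx 2.612$. For atoms held in a harmonic trap, the corresponding result is $k_B T_c \approx 0.94\,\hbar\bar{\omega}N^{1/3}$, so a larger atom number $N$ (all else being equal) means condensation sets in at a higher temperature.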
-Since then hundreds of groups around the world have created BECs, which have been used for everything from slowing light to making “atom lasers” and even modelling the behaviour of black holes. Moreover, the interactions between the atoms can be finely controlled, meaning BECs can be used to simulate properties of condensed-matter systems that are extremely difficult – or impossible – to probe in real materials.
-
Coming in at number three is the 2013 prize, which went to François Englert and the late Peter Higgs for discovering the mechanism by which subatomic particles get mass. Their work was confirmed in 2012 by the discovery of the so-called Higgs boson at the ATLAS and CMS experiments at CERN’s Large Hadron Collider.
-Higgs and Englert didn’t, of course, win for detecting the Higgs boson, although the Nobel citation does credit the ATLAS and CMS teams. What they were being honoured for was work done back in the early 1960s, when they published papers independently of each other that provided a mechanism by which particles can have the masses we observe.
-Higgs had been studying spontaneous symmetry breaking, which led to the notion of massless, force-carrying particles, known as Goldstone bosons. But what Higgs realized was that Goldstone bosons don’t necessarily occur when a symmetry is spontaneously broken – they could be reinterpreted as an additional quantum (polarization) state of a force-carrying particle.
-The leftover terms in the equations represented a massive particle – the Higgs boson – avoiding the need for a massless unobserved particle. Writing in his now-famous 1964 paper (Phys. Rev. Lett. 13 508), Higgs highlighted the possibility of a massive spin-zero boson, which is what was discovered at CERN in 2012.
-That work probably got more media attention than any other Nobel prize this century, because who doesn’t love a huge international collaboration tracking down a particle on the biggest physics experiment of all time? Especially as the Standard Model doesn’t predict what the boson’s mass should be, so it was hard to know where to look. But it doesn’t take top slot in my book because it “only” confirmed what we had expected – and we’re still on the look-out for “new physics” beyond the Standard Model.
-
Taking second place in our list is the discovery that the expansion of the universe is not slowing down – but accelerating – thanks to studies of exploding stars called supernovae. As with so many Nobel prizes these days, the 2011 award went to three people: Brian Schmidt, who led the High-Z Supernovae Search Team, and his colleague Adam Riess, and to Saul Perlmutter who led the rival Supernova Cosmology Project.
-Theirs was a pretty sensational finding that implied that about three-quarters of the mass–energy content of the universe must consist of some weird, gravitationally repulsive substance, dubbed “dark energy”, about which even now we still know virtually nothing. It had previously been assumed that the universe would – depending on how much matter it contains – either collapse eventually in a big crunch or go on expanding forever, albeit at an ever more gentle pace.
-The teams had been studying type Ia supernovae, which always blow up in the same way when they reach the same mass, meaning they can be used as “standard candles” to accurately measure distances in the universe. Such supernovae are very rare, and the two groups had to carry out painstaking surveys using ground-based telescopes and the Hubble Space Telescope to find enough of them.
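The “standard candle” logic rests on the distance modulus relation (standard astronomy, not spelled out in the article): because type Ia supernovae peak at nearly the same absolute magnitude $M$, measuring the apparent magnitude $m$ gives the distance $d$ via

```latex
m - M = 5\log_{10}\!\left(\frac{d}{10\,\mathrm{pc}}\right).
```

Supernovae that appeared fainter than expected for their redshift therefore implied larger distances than a decelerating universe allows – the signature of accelerating expansion.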
The teams thought they’d find that the expansion of the universe is decelerating, but as more and more data piled up, the results only appeared to make sense if the universe has a force pushing matter apart. The Royal Swedish Academy of Sciences said the discovery was “as significant” as the 2006 prize, which had gone to John Mather and the late George Smoot for their discovery in 1992 of the minute temperature variations in the cosmic microwave background – the fossil remnants of the large-scale structures in today’s universe.
But to me, the accelerating expansion has the edge as the implications are even more profound, pointing as they do to the composition and fate of the cosmos.

And finally, the winner of the greatest Nobel Prize for Physics of the 21st century is the 2017 award, which went to Barry Barish, Kip Thorne and the late Rainer Weiss for the discovery of gravitational waves. Not only is it the most recent prize on my list, it’s also memorable for being a genuine first – discovering the “ripples in space–time” originally predicted by Einstein. The two LIGO detectors in Livingston, Louisiana, and Hanford, Washington, are also astonishing feats of engineering, capable of detecting changes in distance tinier than the radius of the proton.
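To put that sensitivity into numbers – a rough back-of-envelope check using round figures I am assuming here, not values quoted in the article – a strain of order 10⁻²¹, typical of a loud event, stretches LIGO’s 4 km arms by only a few thousandths of a proton radius:

```python
strain = 1e-21             # dimensionless strain h = dL/L, typical of a loud event
arm_length_m = 4_000.0     # each LIGO arm is 4 km long
proton_radius_m = 8.4e-16  # approximate proton charge radius

# The length change the interferometer must resolve
delta_L = strain * arm_length_m
print(f"arm length change: {delta_L:.1e} m")  # 4.0e-18 m
print(f"fraction of a proton radius: {delta_L / proton_radius_m:.4f}")  # ~0.0048
```

Resolving a displacement several hundred times smaller than a proton is what makes the LIGO interferometers such a remarkable feat of engineering.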
The story of how gravitational waves were first observed is now well known. In the early hours of Monday 14 September 2015, just after staff who had been calibrating the LIGO detector in Livingston had gone to bed, gravitational waves created by the collision of two black holes 1.3 billion light-years away hit the two LIGO detectors. The historic measurement, dubbed GW150914, hit the headlines around the world.
More than 200 gravitational-wave events have so far been detected – and observing these ripples, which had long been on many physicists’ bucket lists, has over the last decade become almost routine. Most detections have been binary black-hole mergers, though there have also been a few neutron-star/black-hole collisions and some binary neutron-star mergers. Gravitational-wave astronomy is now a well-established field, thanks not just to LIGO but also to Virgo in Italy and KAGRA in Japan. There are also plans for an even more advanced Einstein Telescope, which could detect in a day what it took LIGO a decade to spot.
Gravitational waves also opened up the whole new field of “multimessenger astronomy” – the idea that you observe a cosmic event with gravitational waves and then do follow-up studies using other instruments, measuring it with cosmic rays, neutrinos and photons. Each of these cosmic messengers is produced by distinct processes and so carries information about different mechanisms within its source.
The messengers also differ widely in how they carry this information to the astronomer: gravitational waves and neutrinos, for example, can pass through matter and intergalactic magnetic fields, providing an unobstructed view of the universe at all wavelengths. Combining observations of different messengers will therefore let us see more and look further.
The post The top five physics Nobel prizes of the 21st century revealed appeared first on Physics World.
The post ASTRO 2025: expanding the rules of radiation therapy appeared first on Physics World.
Yashar was speaking at a news briefing arranged to highlight a select few high-impact abstracts. In accord with the ASTRO 2025 meeting’s theme of “rediscovering radiation medicine and exploring new indications”, the chosen presentations included examples of innovative techniques and less common indications, including radiotherapy treatments of non-malignant disease and a novel combination of external-beam radiation with radioligand therapy.
Ventricular tachycardia (VT) is a life-threatening heart rhythm disorder that’s usually treated with medication, implantation of a cardiac device and then catheter ablation – an invasive procedure in which a long catheter is inserted via a leg vein into the heart to destroy abnormal cardiac tissue. A research team at Washington University School of Medicine has now shown that stereotactic arrhythmia radiation therapy (STAR) could provide an equally effective and potentially safer treatment alternative.

STAR works by delivering precision beams of radiation to the scarred tissue that drives the abnormal heart rhythm, without requiring invasive catheters or anaesthesia.
“Over the past several years, STAR has emerged as a novel non-invasive treatment for patients with refractory VT,” said Shannon Jiang, who presented the team’s findings at ASTRO. “So far, there have been several single-arm studies showing promising results for STAR, but there are currently no data that directly compare STAR to catheter ablation, and that’s the goal for our study.”
Jiang and colleagues retrospectively analysed data from 43 patients with recurrent refractory VT (VT that no longer responds to treatment), who were treated with either STAR or repeat catheter ablation at a single institution. The team found that both treatments were similarly effective at controlling arrhythmia, but that patients receiving radiation had far fewer serious side effects.
Within one year of the procedure, eight patients (38%) in the ablation group experienced treatment-related serious adverse events, compared with just two (9%) in the STAR group. These complications also occurred sooner after ablation (a median of six days) than after radiation (10 months). Four patients receiving ablation died within a month of treatment, soon after experiencing an adverse event, and one patient did not survive the procedure. In the STAR group, in contrast, there were no deaths attributed to treatment-related side effects. One year after treatment, overall survival was 73% following radiation and 58% after ablation; at three years (the median follow-up time), it was 45% in both groups.
“Despite the fact that this is a retrospective, non-randomized analysis, our study provides some important preliminary data that support the use of STAR as a potentially safer and equally effective treatment option for patients with high-risk refractory VT,” Jiang concluded.
Commenting on the study, Kenneth Rosenzweig from the Icahn School of Medicine at Mount Sinai emphasized that the vast majority of patients with VT will be well cared for by standard cardiac ablation, but that radiation can help in certain situations. “This study shows that for patients where the ablation just isn’t working anymore, there’s another option. Some patients will really need the help of radiation medicine to get them through, and work like this will help us figure out who those patients are and what we can do to improve their quality-of-life.”
A clinical trial headed up at the University of California, Los Angeles, has shown that adding radioligand therapy to metastasis-directed radiation therapy more than doubles progression-free survival in men with oligometastatic prostate cancer, without increasing toxicity.
“When we pair external-beam radiation directed to tumours we can see with a radiopharmaceutical to reach microscopic disease we can’t see, patients can experience a notably longer interval before progression,” explained principal investigator Amar Kishan.
Patients with oligometastatic prostate cancer (up to five metastases outside the prostate after initial therapy) are increasingly treated with metastasis-directed stereotactic body radiation therapy (SBRT). While this treatment can delay progression and the need for hormone therapy, the cancer recurs in most patients, likely due to the presence of undetectable microscopic disease.

Radioligand therapy uses a radiopharmaceutical drug to deliver precise radiation doses directly to tumours. For prostate cancer, the drug combines radioactive isotope lutetium-177 with a ligand that targets the prostate-specific membrane antigen (PSMA) found on cancer cells. Following its promising use in men with advanced prostate cancer, the team examined whether adding radioligand therapy to SBRT could also improve progression-free survival in men with early metastatic disease.
The phase II LUNAR trial included 92 men with oligometastatic prostate cancer and one to five distant lesions visible on a PSMA PET/CT scan. The patients were randomized to receive either SBRT alone (the control arm) or two cycles of the investigational PSMA-targeting drug 177Lu-PNT2002, eight weeks apart, followed by SBRT.
At a median follow-up of 22 months, adding radioligand therapy improved median progression-free survival from 7.4 to 17.3 months. Hormone therapy was also delayed, from 14.1 months in the control group to 24.3 months. Of the 65 progression events observed, 64 were due to new lesions rather than regrowth at previously treated sites. Both treatments were well tolerated, with no difference in severe side effects between the two groups.
“We conclude that adding two cycles of 177Lu-PNT2002 to SBRT significantly improves progression-free survival in men with oligorecurrent prostate cancer, presumably by action on occult metastatic disease, without an increase in toxicity,” said Kishan. “Ultimately, while this intervention worked well, 64% of patients even on the investigational arm still had some progression, so we could further optimize the dose and cycle and other variables for these patients.”
Osteoarthritis is a painful joint disease that arises when the cartilage cushioning the ends of bones wears down. Treatments include pain medication, which can cause significant side effects with long-term use, or invasive joint replacement surgery. Byoung Hyuck Kim from Seoul National University College of Medicine described how low-dose radiotherapy (LDRT) could help bridge this treatment gap.

LDRT could provide a non-invasive alternative treatment for knee osteoarthritis, a leading cause of disability, Kim explained. But while LDRT is commonly employed in Europe to treat joint pain, its use in other countries is limited by low awareness and a lack of high-quality randomized evidence. To address this shortfall, Kim and colleagues performed a randomized, placebo-controlled trial designed to provide sufficient evidence to incorporate LDRT into the clinical standard-of-care.
“There’s a clinical need for moderate interventions between weak pain medications and aggressive surgery, and we think radiation may be a suitable option for those patients, especially when drugs and injections are poorly tolerated,” said Kim.
The multicentre trial included 114 patients with mild to moderate knee osteoarthritis. Participants were randomized to receive one of three treatments: 0.3 Gy of radiotherapy in six fractions; 3 Gy in six fractions; or sham irradiation, in which the treatment system did not deliver radiation – an approach that had not been tested in previous studies.
The use of pain medication was limited, to avoid masking effects from the radiation itself. Response was considered positive if the patients (who did not know which treatment they had received) exhibited improvements in pain levels, physical function and overall condition.
“Interestingly, at one month [after treatment], the response rates were very similar across all groups, which reflects a strong placebo effect from the sham group,” said Kim. “At four months, after the placebo effect had diminished, the 3 Gy group demonstrated a significantly higher response rate compared to the sham control group; however, the 0.3 Gy group did not.”
The response rates at four months were 70.3%, 58.3% and 41.7% for the 3 Gy, 0.3 Gy and sham groups, respectively. As expected, with radiation doses less than 5% of those typically used for cancer treatments, no radiation-related side effects were observed.
“Our study shows that a single course of low-dose radiotherapy improves knee osteoarthritis symptoms and function at four months, with no treatment-related toxicity observed,” Kim concluded. “So our trial could provide objective evidence and suggest that LDRT is a non-pharmacologic, scalable option that merits further trials.”
“While small, [the study] was really well executed in terms of being placebo controlled. It clearly showed that the 3 Gy arm was superior to the placebo control arm and there was a 30% benefit,” commented Kristina Mirabeau-Beale from GenesisCare. “So I think we can say definitively that the benefit is from radiation more than just the placebo effect of interacting with our healthcare system.”
The post Quantum information or metamaterials: our predictions for this year’s Nobel Prize for Physics appeared first on Physics World.
On Tuesday 7 October the winner(s) of the 2025 Nobel Prize for Physics will be announced. The process of choosing the winners is highly secretive, so looking for hints about who will be this year’s laureates is futile. Indeed, in the immediate run-up to the announcement, only members of the Nobel Committee for Physics and the Class for Physics at the Royal Swedish Academy of Sciences know who will be minted as the latest Nobel laureates. What is more, recent prizes provide little guidance because the deliberations and nominations are kept secret for 50 years. So we really are in the dark when it comes to predicting who will be named next week.
If you would like to learn more about how the Nobel Prize for Physics is awarded, check out this profile of Lars Brink, who served on the Nobel Committee for Physics on eight occasions.
But this level of secrecy doesn’t stop people like me from speculating about this year’s winners. Before I explain the rather lovely infographic that illustrates this article – and how it could be used to predict future Nobel winners – I am going to share my first prediction for next week.
Inspired by last year’s physics Nobel prize, which went to two computer scientists for their work on artificial intelligence, I am predicting that the 2025 laureates will be honoured for their work on quantum information and algorithms. Much of the pioneering work in this field was done several decades ago and has now come to fruition in functioning quantum computers and cryptography systems. So the time seems right for an award, and I have four people in mind: Peter Shor, Gilles Brassard, Charles Bennett and David Deutsch. However, only three can share the prize.
Moving on to our infographic, which lends a bit of pseudoscientific credibility to my next predictions, it charts the history of the physics Nobel prize by field of endeavour. One thing that is apparent is that since about 1990 there have been clear gaps between awards in certain fields. If you look at “atomic, molecular and optical physics”, for example, there are gaps of about 5–10 years between awards. One might conclude, therefore, that the Nobel committee considers the field of an award and tries to avoid bunching together awards in the same field.
From the infographic, it appears we are long overdue a prize in nuclear and particle physics – the last was awarded 10 years ago. However, we haven’t had many big breakthroughs in this field lately. Two aspects of particle physics that have been very fruitful in the 21st century are the study of the quark–gluon plasma formed when heavy nuclei collide, and the precise study of antimatter – observing how it behaves under gravity, for example. But I think it might be a bit too early for Nobels in these fields.
One possibility for a particle-physics Nobel is the development of the theory of cosmic inflation, which seeks to explain the observed nature of the current universe by invoking an exponential expansion of the universe in its very early history. If an award were given for inflation, it would almost certainly go to Alan Guth and Andrei Linde. A natural choice for the third slot would have been Alexei Starobinsky, who sadly died in 2023 – and Nobels are not awarded posthumously. If there were a third winner for inflation, it would probably be Paul Steinhardt.
The last Nobel prize in condensed-matter physics came in 2016, so what work in that field would be worthy of an award this year? There has been a lot of very interesting research in the field of metamaterials – materials that are engineered to have specific properties, particularly in terms of how they interact with light or sound.
A Nobel prize for metamaterials would surely go to the theorist John Pendry, who pioneered the concept of transformation optics. This simplifies our understanding of how light interacts with metamaterials and helps with the design of objects and devices with amazing properties. These include invisibility cloaks – the first of which was built in 2006 by the experimentalist David Smith, who I think is also a contender for this year’s Nobel prize. Smith’s cloak works at microwave frequencies, but my nomination for the third slot has done an amazing amount of work on developing metamaterials for practical applications in optics. If you follow this field, you know that I am thinking of the applied physicist Federico Capasso – who is also known for the invention of the quantum cascade laser.