diff --git "a/raw_rss_feeds/https___physicsworld_com_feed_.xml" "b/raw_rss_feeds/https___physicsworld_com_feed_.xml" --- "a/raw_rss_feeds/https___physicsworld_com_feed_.xml" +++ "b/raw_rss_feeds/https___physicsworld_com_feed_.xml" @@ -14,9 +14,9 @@ xmlns:rawvoice="https://blubrry.com/developer/rawvoice-rss/" Physics World - https://physicsworld.com/a/the-obscure-physics-theory-that-helped-chinese-science-emerge-from-the-shadows/ + https://physicsworld.com/a/cosmic-time-capsules-the-search-for-pristine-comets/ - Wed, 21 Jan 2026 11:02:40 +0000 + Fri, 23 Jan 2026 13:42:28 +0000 en-GB Copyright by IOP Publishing Ltd and individual contributors hourly @@ -48,13 +48,277 @@ xmlns:rawvoice="https://blubrry.com/developer/rawvoice-rss/" + Cosmic time capsules: the search for pristine comets + https://physicsworld.com/a/cosmic-time-capsules-the-search-for-pristine-comets/ + Fri, 23 Jan 2026 13:40:59 +0000 + + + + https://physicsworld.com/?p=126073 + Ancient icy wanderers like comet 3I/ATLAS, spotted in July 2025, can reveal secrets of our cosmic origins

+

The post Cosmic time capsules: the search for pristine comets appeared first on Physics World.

+]]>
+ + +

In this episode of Physics World Stories, host Andrew Glester explores the fascinating hunt for pristine comets – icy bodies that preserve material from the solar system’s beginnings and even earlier. Unlike more familiar comets that repeatedly swing close to the Sun and transform, these frozen relics act as time capsules, offering unique insights into our cosmic history.

+ +
Pale blue circle against red streaks: composite image of interstellar comet 3I/ATLAS captured by the Europa Ultraviolet Spectrograph instrument on NASA’s Europa Clipper spacecraft
+
+
+

The first guest is Tracy Becker, deputy principal investigator for the Ultraviolet Spectrograph on NASA’s Europa Clipper mission. Becker describes how the Jupiter-bound spacecraft recently turned its gaze to 3I/ATLAS, an interstellar visitor that appeared last July. Mission scientists quickly reacted to this unique opportunity, which also enabled them to test the mission’s instruments before it arrives at the icy world of Europa.

+

Michael Küppers then introduces the upcoming Comet Interceptor mission, set for launch in 2029. This joint ESA–JAXA mission will “park” in space until a suitable comet arrives from the outer reaches of the solar system, then deploy two probes to study it from multiple angles – offering a first-ever close look at material untouched since the solar system’s birth.

+ +
+
+

From interstellar wanderers to carefully orchestrated intercepts, this episode blends pioneering missions and cosmic detective work. Keep up to date with all the latest space and astronomy developments in the dedicated section of the Physics World website.

+
+

The post Cosmic time capsules: the search for pristine comets appeared first on Physics World.

+]]>
+ + Physics World + + Cosmic time capsules: the search for pristine comets + full + 51:40 + +Podcasts +Ancient icy wanderers like comet 3I/ATLAS, spotted in July 2025, can reveal secrets of our cosmic origins +https://physicsworld.com/wp-content/uploads/2026/01/hubble-3i-atlas-scaled.jpg +
+ + Hot ancient galaxy cluster challenges current cosmological models + https://physicsworld.com/a/hot-ancient-galaxy-cluster-challenges-current-cosmological-models/ + Fri, 23 Jan 2026 11:30:13 +0000 + + + + https://physicsworld.com/?p=126130 + Observations of the thermal energy in a baby galaxy cluster 12.4 billion light years away suggest much more energetic early cluster growth than current theories assume

+

The post Hot ancient galaxy cluster challenges current cosmological models appeared first on Physics World.

+]]>
As with people, behaviour in cosmology does not always extrapolate simply with age. An early-career politician may be more likely to win a debate with a student than with a seasoned diplomat, but put all three in a room with a toddler and the toddler will almost certainly get their own way – they are following a different set of rules. A team of global collaborators noticed a similar phenomenon when peering at a cluster of developing galaxies from a time when the universe was just a tenth of its current age.

+

Cosmological theories suggest that such infant clusters should host much cooler and less abundant gas than more mature clusters. But what the researchers saw was at least five times hotter than expected – apparently not abiding by those rules.

+

“That’s a massive surprise and forces us to rethink how large structures actually form and evolve in the universe,” says first author Dazhi Zhou, a PhD candidate at the University of British Columbia.

+

Eyes on the past

+

Looking into distant outer space allows us to peer into the past. The protocluster of developing galaxies that Zhou and collaborators investigated – known as SPT2349–56 – is 12.4 billion light years away, so the light observed from it left home when the universe was just 1.4 billion years old. Light from so far away will be quite faint and hard to detect by the time it reaches us, so the researchers used the Atacama Large Millimeter/submillimeter Array (ALMA) to study SPT2349–56 using a special type of shadow.
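As a quick sanity check on those numbers (our own arithmetic, assuming the standard 13.8-billion-year age of the universe, which the article does not state explicitly): subtracting the light-travel time from the present age gives the age of the universe when the light was emitted.

```python
# Consistency check (our arithmetic, not from the article): the quoted
# "12.4 billion light years" is a light-travel distance, so the light left
# SPT2349-56 when the universe was 13.8 - 12.4 = 1.4 billion years old,
# matching the figure in the text.
AGE_OF_UNIVERSE_GYR = 13.8    # standard cosmology estimate (assumed)
LIGHT_TRAVEL_TIME_GYR = 12.4  # quoted for SPT2349-56

age_at_emission = AGE_OF_UNIVERSE_GYR - LIGHT_TRAVEL_TIME_GYR
print(f"{age_at_emission:.1f} billion years")  # 1.4 billion years
```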

+ +

As this type of protocluster develops, Zhou explains, the gas around its galaxies becomes so hot that electrons in the gas interact with, and confer some of their energy upon, passing photons. This leaves light passing through the gas with more photons at the higher-energy end of the spectrum and fewer at the lower end. When viewing the cosmic microwave background radiation – the “afterglow” left behind by the Big Bang – this results in a shadow at low energies. This energy shift, predicted by physicists Rashid Sunyaev and Yakov Zeldovich, not only reveals the presence of the protocluster: the strength of the signature also indicates the thermal energy of the protocluster’s gas.
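The standard quantitative statement of this effect – not given in the article, but well established – is the thermal Sunyaev–Zeldovich distortion. In the low-frequency (Rayleigh–Jeans) part of the CMB spectrum, the fractional temperature decrement is proportional to the Compton-y parameter, a line-of-sight integral of the electron pressure, which is why the depth of the “shadow” measures the gas’s thermal energy:

```latex
% Thermal Sunyaev-Zeldovich effect (standard result; n_e electron density,
% T_e electron temperature, sigma_T Thomson cross-section; the integral
% runs along the line of sight)
y = \frac{\sigma_{\mathrm{T}}}{m_{\mathrm{e}} c^{2}}
    \int n_{\mathrm{e}}\, k_{\mathrm{B}} T_{\mathrm{e}}\, \mathrm{d}l ,
\qquad
\left.\frac{\Delta T}{T_{\mathrm{CMB}}}\right|_{\mathrm{RJ}} \simeq -2y .
```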

+

The team’s observations were not easy. “This shadow is actually pretty tiny,” Zhou explains. In addition, there is thermal emission from the dust inside galaxies at radio wavelengths, originally estimated to be 20 times stronger than the Sunyaev–Zeldovich signature. “It really is like finding a needle in a haystack,” he adds. Nonetheless, the team did identify a definite Sunyaev–Zeldovich signature from SPT2349–56, with a thermal energy indicating that it was at least five times hotter than expected – thousands of times hotter than the surface of our Sun.

+

Time to upgrade?

+

SPT2349–56 has some quirks that may explain its high thermal energy, including three supermassive black holes shooting out jets of high-energy matter – a known but rare phenomenon for supermassive black holes. However, even simulations that include these outbursts as a heating mechanism – one that is more efficient, and kicks in much earlier, than the heating from gravitational collapse that current models assume – still do not reproduce the high temperatures observed, perhaps pointing to gaps in our knowledge of the underlying physics.

+ +

Eiichiro Komatsu from the Max-Planck-Institut für Astrophysik describes the work as “a wonderful measurement”. Although not directly involved in this research, Komatsu has also looked at what the Sunyaev–Zeldovich effect can reveal about the cosmos. “The amount of thermal energy measured by the authors is staggering, yet its origin is a mystery,” he tells Physics World. He suggests these results will motivate further observations of other systems in the early universe.

+

“We need to be cautious rather than making any big claim,” adds Zhou. This is the first Sunyaev–Zeldovich detection of a protocluster from the first three billion years of the universe’s existence. Next, he aims to study similar protoclusters, and he hopes others will also work to corroborate the observations.

+

The research is reported in Nature.

+

The post Hot ancient galaxy cluster challenges current cosmological models appeared first on Physics World.

+]]>
+ Research update +Observations of the thermal energy in a baby galaxy cluster 12.4 billion light years away suggest much more energetic early cluster growth than current theories assume +https://physicsworld.com/wp-content/uploads/2026/01/23-01-26-galaxy-cluster.jpg +
+ + Laser fusion: Focused Energy charts a course to commercial viability + https://physicsworld.com/a/laser-fusion-focused-energy-charts-a-course-to-commercial-viability/ + Thu, 22 Jan 2026 15:01:44 +0000 + + + + + https://physicsworld.com/?p=126112 + Plasma physicist Debbie Callahan is our podcast guest

+

The post Laser fusion: Focused Energy charts a course to commercial viability appeared first on Physics World.

+]]>
This episode of the Physics World Weekly podcast features a conversation with the plasma physicist Debbie Callahan, who is chief strategy officer at Focused Energy – a fusion-energy startup based in California and Germany. Prior to that, she spent 35 years working at the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory in the US.

+ +

Focused Energy is developing a commercial system for generating energy from the laser-driven fusion of hydrogen isotopes. Callahan describes LightHouse, which is the company’s design for a laser-fusion power plant, and Pearl, which is the firm’s deuterium–tritium fuel capsule.

+

Callahan talks about the challenges and rewards of working in the fusion industry and also calls on early-career physicists to consider careers in this burgeoning sector.

+

The post Laser fusion: Focused Energy charts a course to commercial viability appeared first on Physics World.

+]]>
+ Podcasts +Plasma physicist Debbie Callahan is our podcast guest +https://physicsworld.com/wp-content/uploads/2026/01/22-1-25-debbie-callahan-list.jpg +
+ + Fuel cell catalyst requirements for heavy-duty vehicle applications + https://physicsworld.com/a/fuel-cell-catalyst-requirements-for-heavy-duty-vehicle-applications/ + Thu, 22 Jan 2026 11:25:19 +0000 + + + + + https://physicsworld.com/?p=125182 + Join the audience for a live webinar at 3 p.m. GMT/10 a.m. EST on 18 February 2026

+

Discover the realities and requirements of catalyst development for fuel cell applications

+

The post Fuel cell catalyst requirements for heavy-duty vehicle applications appeared first on Physics World.

+]]>
+

+

Heavy-duty vehicles (HDVs) powered by hydrogen-based proton-exchange membrane (PEM) fuel cells offer a cleaner alternative to diesel-powered internal combustion engines for decarbonizing long-haul transportation sectors. The development path of sub-components for HDV fuel-cell applications is guided by the total cost of ownership (TCO) analysis of the truck.

+

TCO analysis suggests that the cost of the hydrogen fuel consumed over the lifetime of the HDV dominates the fuel-cell stack capital expense (CapEx), because trucks typically operate over very high mileages (around a million miles). Commercial HDV applications consume more hydrogen and demand higher durability, meaning that TCO is largely determined by fuel-cell efficiency and catalyst durability.

+

This article is written to bridge the gap between the industrial requirements and academic activity for advanced cathode catalysts with an emphasis on durability. From a materials perspective, the underlying nature of the carbon support, Pt-alloy crystal structure, stability of the alloying element, cathode ionomer volume fraction, and catalyst–ionomer interface play a critical role in improving performance and durability.

+

We provide our perspective on the four major approaches currently being pursued – namely, mesoporous carbon supports, ordered PtCo intermetallic alloys, thrifting of the ionomer volume fraction, and shell-protection strategies. While each approach has its merits and demerits, we highlight their key developmental needs for the future.

+ +
+

Nagappan Ramaswamy joined the Department of Chemical Engineering at IIT Bombay as a faculty member in January 2025. He earned his PhD in 2011 from Northeastern University, Boston, specialising in fuel cell electrocatalysis.

+

He then spent 13 years working in industrial R&D – two years at Nissan North America in Michigan, USA, focusing on lithium-ion batteries, followed by 11 years at General Motors in Michigan focusing on low-temperature fuel cells and electrolyser technologies. While at GM, he led two multi-million-dollar research projects funded by the US Department of Energy focused on the development of proton-exchange membrane fuel cells for automotive applications.

+

At IIT Bombay, his primary research interests include low-temperature electrochemical energy-conversion and storage devices such as fuel cells, electrolysers and redox-flow batteries involving materials development, stack design and diagnostics.

+

The post Fuel cell catalyst requirements for heavy-duty vehicle applications appeared first on Physics World.

+]]>
+ Webinar +Join the audience for a live webinar at 3 p.m. GMT/10 a.m. EST on 18 February 2026 + +Discover the realities and requirements of catalyst development for fuel cell applications +https://physicsworld.com/wp-content/uploads/2025/11/2026-02-ecs-wb-feature-image.jpg +
+ + Ask me anything: Mažena Mackoit-Sinkevičienė – ‘Above all, curiosity drives everything’ + https://physicsworld.com/a/ask-me-anything-mazena-mackoit-sinkeviciene-above-all-curiosity-drives-everything/ + Thu, 22 Jan 2026 11:00:23 +0000 + + + + + https://physicsworld.com/?p=126019 + Mažena Mackoit-Sinkevičienė works in quantum optics and technology and is vice-president of the Lithuanian Physical Society

+

The post Ask me anything: Mažena Mackoit-Sinkevičienė – ‘Above all, curiosity drives everything’ appeared first on Physics World.

+]]>
+ What skills do you use every day in your job? +

Much of my time is spent trying to build and refine models in quantum optics, usually with just a pencil, paper and a computer. This requires an ability to sit with difficult concepts for a long time, sometimes far longer than is comfortable, until they finally reveal their structure.

+

Good communication is equally essential – I teach students; collaborate with colleagues from different subfields; and translate complex ideas into accessible language for the broader public. Modern physics connects with many different fields, so being flexible and open-minded matters as much as knowing the technical details. Above all, curiosity drives everything. When I don’t understand something, that uncertainty becomes my strongest motivation to keep going.

+

What do you like best and least about your job?

+

What I like the best is the sense of discovery – the moment when a problem that has evaded understanding for weeks suddenly becomes clear. Those flashes of insight feel like hearing the quiet whisper of nature itself. They are rare, but they bring along a joy that is hard to find elsewhere.

+ +

I also value the opportunity to guide the next generation of physicists, whether in the university classroom or through public science communication. Teaching brings a different kind of fulfilment: witnessing students develop confidence, curiosity and a genuine love for physics.

+

What I like the least is the inherent uncertainty of research. Questions do not promise favourable answers, and progress is rarely linear. Fortunately, I have come to see this unpredictability not as a weakness but as a source of power that forces growth, new perspectives and, ultimately, deeper understanding.

+

What do you know today that you wish you knew when you were starting out in your career?

+

I wish I had known that feeling lost is not a sign of inadequacy but a natural part of doing physics at a high level. Not understanding something can be the greatest motivator, provided one is willing to invest time and effort. Passion and curiosity matter far more than innate brilliance. If I had realized earlier that steady dedication can carry you farther than talent alone, I would have embraced uncertainty with much more confidence.

+

The post Ask me anything: Mažena Mackoit-Sinkevičienė – ‘Above all, curiosity drives everything’ appeared first on Physics World.

+]]>
+ Interview +Mažena Mackoit-Sinkevičienė works in quantum optics and technology and is vice-president of the Lithuanian Physical Society +https://physicsworld.com/wp-content/uploads/2026/01/2026-01-ama-mazena-mackoit-sinkeviciene.jpg +newsletter
+ + Modelling wavefunction collapse as a continuous flow yields insights on the nature of measurement + https://physicsworld.com/a/modelling-wavefunction-collapse-as-a-continuous-flow-yields-insights-on-the-nature-of-measurement/ + Thu, 22 Jan 2026 09:30:19 +0000 + + + + https://physicsworld.com/?p=126064 + Quantum state diffusion framework makes it possible to characterize quantum measurement in terms of entropy production

+

The post Modelling wavefunction collapse as a continuous flow yields insights on the nature of measurement appeared first on Physics World.

+]]>
+ “God does not play dice.”

+

With this famous remark at the 1927 Solvay Conference, Albert Einstein set the tone for one of physics’ most enduring debates. At the heart of his dispute with Niels Bohr lay a question that continues to shape the foundations of physics: does the apparently probabilistic nature of quantum mechanics reflect something fundamental, or is it simply due to lack of information about some “hidden variables” of the system that we cannot access?

+

Physicists at University College London (UCL) in the UK have now addressed this question via the concept of quantum state diffusion (QSD). In QSD, the wavefunction does not collapse abruptly. Instead, wavefunction collapse is modelled as a continuous interaction with the environment that causes the system to evolve gradually toward a definite state, restoring some degree of intuition to the counterintuitive quantum world.

+

A quantum coin toss

+

To appreciate the distinction (and the advantages it might bring), imagine tossing a coin. While the coin is spinning in midair, it is neither fully heads nor fully tails – its state represents a blend of both possibilities. This mirrors a quantum system in superposition.

+ +

When the coin eventually lands, the uncertainty disappears and we obtain a definite outcome. In quantum terms, this corresponds to wavefunction collapse: the superposition resolves into a single state upon measurement.

+

In the standard interpretation of quantum mechanics, wavefunction collapse is considered instantaneous. However, this abrupt transition is challenging from a thermodynamic perspective because uncertainty is closely tied to entropy. Before measurement, a system in superposition carries maximal uncertainty, and thus maximum entropy. After collapse, the outcome is definite and our uncertainty about the system is reduced, thereby reducing the entropy.
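As a toy illustration of this bookkeeping (our example, not the authors’ calculation), the uncertainty before and after measurement can be quantified as the Shannon entropy of the outcome distribution:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits; outcomes with p = 0 or p = 1 contribute nothing."""
    return sum(-p * math.log2(p) for p in probs if 0 < p < 1)

# A qubit in an equal superposition: both measurement outcomes equally likely.
before = shannon_entropy([0.5, 0.5])  # 1 bit of uncertainty
# After collapse: a single definite outcome.
after = shannon_entropy([1.0, 0.0])   # 0 bits

print(before, after)  # 1.0 0
```

The one bit that vanishes from the system side at collapse is exactly the kind of entropy whose destination the QSD framework makes explicit.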

+

This apparent reduction in entropy immediately raises a deeper question. If the system suddenly becomes more ordered at the moment of measurement, where does the “missing” entropy go?

+

From instant jumps to continuous flows

+

Returning to the coin analogy, imagine that instead of landing cleanly and instantly revealing heads or tails, the coin wobbles, leans, slows and gradually settles onto one face. The outcome is the same, but the transition is continuous rather than abrupt.

+

This gradual settling captures the essence of QSD. Instead of an instantaneous “collapse”, the quantum state unfolds continuously over time. This makes it possible to track various parameters of thermodynamic change, including a quantity called environmental stochastic entropy production that measures how irreversible the process is.

+

Another benefit is that whereas standard projective measurements describe an abrupt “yes/no” outcome, QSD models a broader class of generalized or “weak” measurements, revealing the subtle ways quantum systems evolve. It also allows physicists to follow individual trajectories rather than just average outcomes, uncovering details that the standard framework smooths over.
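To make “individual trajectories” concrete, here is a minimal toy simulation (our sketch under simplifying assumptions, not the UCL group’s model). For diffusive continuous measurement of a qubit observable, the expectation value z = ⟨σz⟩ obeys a drift-free stochastic equation of the form dz = √κ(1 − z²)dW: the ensemble average of z is conserved, while each trajectory purifies to z = ±1 with Born-rule probabilities.

```python
import math
import random

def qsd_trajectory(z0, kappa=1.0, dt=0.005, steps=4000, rng=random):
    """Toy diffusive (QSD-style) measurement of a qubit: z = <sigma_z>
    follows dz = sqrt(kappa) (1 - z^2) dW. With no drift term, E[z] is
    conserved, yet each realization settles onto z = +1 or z = -1."""
    z = z0
    for _ in range(steps):
        dW = rng.gauss(0.0, math.sqrt(dt))  # Wiener increment
        z += math.sqrt(kappa) * (1.0 - z * z) * dW
        z = max(-1.0, min(1.0, z))  # guard against numerical overshoot
    return z

random.seed(1)
outcomes = [qsd_trajectory(0.0) for _ in range(300)]
frac_up = sum(o > 0 for o in outcomes) / len(outcomes)
purity = sum(abs(o) for o in outcomes) / len(outcomes)
print(f"fraction ending near +1: {frac_up:.2f}")  # close to 0.5 for z0 = 0
print(f"mean |z| at end: {purity:.2f}")           # close to 1: purified
```

Averaging many such runs recovers the smooth predictions of the standard framework, while any single run shows the gradual “settling” the article describes.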

+

“The QSD framework helps us understand how unpredictable environmental influences affect quantum systems,” explains Sophia Walls, a PhD student at UCL and the first author of a paper in Physical Review A on the research. Environmental noise, Walls adds, is particularly important for quantum technologies, making the study’s insights valuable for quantum error correction, control protocols and feedback mechanisms.

+

Bridging determinism and probability

+

At first glance, QSD might seem to resemble decoherence, which also arises from system–environment interactions such as noise. But the two differ in scope. “Decoherence explains how a system becomes a classical mixed state,” Walls clarifies, “but not how it ultimately purifies into a single eigenstate.” QSD, with its stochastic term, describes this final purification – the point where the coin’s faint shimmer sharpens into heads or tails.

+

In this view, measurement is not a single act but a continuous, entropy-producing flow of information between system and environment – a process that gradually results in manifestation of one of the possible quantum states, rather than an abrupt “collapse”.

+

“Standard quantum mechanics separates two kinds of dynamics – the deterministic Schrödinger evolution and the probabilistic, instantaneous collapse,” Walls notes. “QSD connects both in a single dynamical equation, offering a more unified description of measurement.”

+ +

This continuous evolution makes otherwise intractable quantities, such as entropy production, measurable and meaningful. It also breathes life into the wavefunction itself. By simulating individual realizations, QSD distinguishes between two seemingly identical mixed states: one genuinely entangled with its environment, and another that simply represents our ignorance. Only in the first case does the system dynamically evolve – a distinction invisible in the orthodox picture.

+

A window on quantum gravity?

+

Could this diffusion-based framework also illuminate other fundamental questions beyond the nature of measurement? Walls thinks it’s possible. Recent work suggests that stochastic processes could provide experimental clues about how gravity behaves at the quantum scale. QSD may one day offer a way to formalize or test such ideas. “If the nature of quantum gravity can be studied through a diffusive or stochastic process, then QSD would be a relevant framework to explore it,” Walls says.

+

The post Modelling wavefunction collapse as a continuous flow yields insights on the nature of measurement appeared first on Physics World.

+]]>
+ Research update +Quantum state diffusion framework makes it possible to characterize quantum measurement in terms of entropy production +https://physicsworld.com/wp-content/uploads/2026/01/22-01-2026-spinning-quantum-coin.png +newsletter
+ + NPL unveils miniature atomic fountain clock   + https://physicsworld.com/a/npl-unveils-miniature-atomic-fountain-clock/ + Wed, 21 Jan 2026 17:23:16 +0000 + + + + https://physicsworld.com/?p=126087 + Precision timekeeper is just 5% the size of a conventional clock

+

The post NPL unveils miniature atomic fountain clock   appeared first on Physics World.

+]]>
+ A miniature version of an atomic fountain clock has been unveiled by researchers at the UK’s National Physical Laboratory (NPL). Their timekeeper occupies just 5% of the volume of a conventional atomic fountain clock while delivering a time signal with a stability that is on par with a full-sized system. The team is now honing its design to create compact fountain clocks that could be used in portable systems and remote locations.

+ +

The ticking of an atomic clock is defined by the frequency of the electromagnetic radiation that is absorbed and emitted by a specific transition between atomic energy levels. Today, the second is defined using a transition in caesium atoms that involves microwave radiation. Caesium atoms are placed in a microwave cavity and a measurement-and-feedback mechanism is used to tune the frequency of the cavity radiation to the atomic transition – creating a source of microwaves with a very narrow frequency range centred at the clock frequency.

+

The first atomic clocks sent a fast-moving beam of atoms through a microwave cavity. The precision of such a beam clock is limited by the relatively short time that individual atoms spend in the cavity. Also, the speed of the atoms means that the measured frequency peak is shifted and broadened by the Doppler effect.

+

Launching atoms

+

These problems were addressed by the development of the fountain clock, in which the atoms are cooled (slowed down) by laser light, which also launches them upwards. The atoms pass through a microwave cavity on the way up, and again as they fall back down. Because they travel far more slowly than in a beam clock, the atoms spend much more time in the cavity, so the time signal from a fountain clock is much more precise than that from a beam clock. However, longer measurement times result in greater thermal spread of the atoms – which degrades clock performance. Trading off measurement time against thermal spread means that the caesium fountain clocks that currently define the second have drops of about 30 cm.
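The payoff of the fountain geometry is easy to estimate with basic kinematics (a back-of-envelope figure of ours, not from the article): for a 30 cm toss, the atoms are in free flight for roughly half a second, compared with the milliseconds a fast thermal beam spends in a cavity.

```python
import math

# Free-flight time for an atomic fountain toss (simple ballistic estimate):
# atoms launched to a height h return after t = 2 * sqrt(2h / g).
g = 9.81   # m/s^2
h = 0.30   # 30 cm drop, as in the caesium fountains that define the second

t_flight = 2 * math.sqrt(2 * h / g)
print(f"{t_flight:.2f} s")  # ~0.49 s
```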

+

Other components are also needed to operate fountain clocks – including a vacuum system and laser and microwave instrumentation. This pushes the height of a typical clock to about 2 m, and makes it a complex and expensive instrument that cannot be easily transported.

+

Now, Sam Walby and colleagues at NPL have shrunk the overall height of a rubidium-based fountain clock to 80 cm, while retaining the 30 cm drop. The result is an instrument that is 5% the volume of one of NPL’s conventional caesium atomic fountain clocks.

+

Precise yet portable

+

“That’s taking it from barely being able to fit through a doorway, to something one could pick up and carry with one arm,” says Walby.

+

Despite the miniaturization, the mini-fountain achieved a stability of one part in 10^15 after several days of operation – which NPL says is comparable to full-sized clocks.

+

Walby told Physics World that the NPL team achieved miniaturization by eliminating two conventional components from their clock design. One is a dedicated chamber used to measure the quantum states of the atoms. Instead, this measurement is made within the clock’s cooling chamber. Also eliminated is a dedicated state-selection microwave cavity, which puts the atoms into the quantum state from which the clock transition occurs.

+

“The mini-fountain also does this [state] selection,” explains Walby, “but instead of using a dedicated cavity, we use a coax-to-waveguide adapter that is directed into the cooling chamber, which creates a travelling wave of microwaves at the correct frequency.”

+

The NPL team also reduced the amount of magnetic shielding used, which meant that the edge effects of the magnetic field had to be more carefully considered. The optics system of the clock was greatly simplified, and the use of commercial components means that the clock is low maintenance and easy to operate – according to NPL.

+

Radical simplification

+

“By radically simplifying and shrinking the atomic fountain, we’re making ultra-precise timing technology available beyond national labs,” said Walby. “This opens new possibilities for resilient infrastructure and next-generation navigation.”

+

According to Walby, one potential use of a miniature atomic fountain clock is as a holdover clock. These are devices that produce a very stable time signal when not synchronized with other atomic clocks. This is important for creating resilience in infrastructure that relies on precision timing – such as communications networks, global navigation satellite systems (including GPS) and power grids. Synchronization is usually done using GNSS signals but these can be jammed or spoofed to disrupt timing systems.

+

Holdover clocks require time errors of just a few nanoseconds over a month, which the new NPL clock can deliver. The miniature atomic clock could also be used as a secondary frequency standard for the SI second.
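That requirement is consistent with the quoted stability, as a quick back-of-envelope calculation (ours, not NPL’s error budget) shows: a constant fractional frequency offset y accumulates a time error of y × elapsed time.

```python
FRACTIONAL_STABILITY = 1e-15        # one part in 10^15, as reported
SECONDS_PER_MONTH = 30 * 24 * 3600  # ~2.6 million seconds

# A constant fractional frequency offset y accumulates a time error of
# y * elapsed_time, so the quoted stability gives a few ns per month.
time_error_ns = FRACTIONAL_STABILITY * SECONDS_PER_MONTH * 1e9
print(f"accumulated error over a month: {time_error_ns:.1f} ns")  # 2.6 ns
```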

+

The small size of the clock also lends itself to portable and even mobile applications, according to Walby: “The adaptation of the mini-fountain technology to mobile platforms will be subject of further developments”.

+

However, the mini-clock is large when compared to more compact or chip-based clocks – which do not perform as well. Therefore, he believes that the technology is more likely to be implemented on ships or ground vehicles than aircraft.

+

“At a minimum, it should be easily transportable compared to the current solutions of similar performance,” he says.

+

“Highly innovative”

+

Atomic-clock expert Elizabeth Donley tells Physics World, “NPL has been highly innovative in recent years in standardizing fountain clock designs and even supplying caesium fountains to other national standards labs and organizations around the world for timekeeping purposes. This new compact rubidium fountain is a continuation of this work and can provide a smaller frequency standard with comparable performance to the larger fountains based on caesium.”

+

Donley spent more than two decades developing atomic clocks at the US National Institute of Standards and Technology (NIST) and now works as a consultant in the field. She agrees that miniature fountain clocks would be useful for holding-over timing information when time signals are interrupted.

+

She adds, “Once the international community decides to redefine the second to be based on an optical transition, it won’t matter if you use rubidium or caesium. So I see this work as more of a practical achievement than a ground-breaking one. Practical achievements are what drives progress most of the time.”

+

The new clock is described in Applied Physics Letters.

+

The post NPL unveils miniature atomic fountain clock   appeared first on Physics World.

+]]>
+ Research update +Precision timekeeper is just 5% the size of a conventional clock +https://physicsworld.com/wp-content/uploads/2026/01/21-1-26-miniature-atomic-fountain-clock.jpg +newsletter
+ + Shining laser light on a material produces subtle changes in its magnetic properties + https://physicsworld.com/a/shining-laser-light-on-a-material-produces-subtle-changes-in-its-magnetic-properties/ + Wed, 21 Jan 2026 14:00:49 +0000 + + + + + https://physicsworld.com/?p=126080 + New use for photolithography could have applications for data storage

+

The post Shining laser light on a material produces subtle changes in its magnetic properties appeared first on Physics World.

+]]>
+ Researchers in Switzerland have found an unexpected new use for an optical technique commonly used in silicon chip manufacturing. By shining a focused laser beam onto a sample of material, a team at the Paul Scherrer Institute (PSI) and ETH Zürich showed that it was possible to change the material’s magnetic properties on a scale of nanometres – essentially “writing” these magnetic properties into the sample in the same way as photolithography etches patterns onto wafers. The discovery could have applications for novel forms of computer memory as well as fundamental research.

+

In standard photolithography – the workhorse of the modern chip manufacturing industry – a light beam passes through a transmission mask and projects an image of the mask’s light-absorption pattern onto a (usually silicon) wafer. The wafer itself is covered with a photosensitive polymer called a resist. Changing the intensity of the light leads to different exposure levels in the resist-covered material, making it possible to create finely detailed structures.

+

In the new work, Laura Heyderman and colleagues in PSI-ETH Zürich’s joint Mesoscopic Systems group began by placing a thin film of a magnetic material in a standard photolithography machine, but without a photoresist. They then scanned a focused laser beam with a wavelength of 405 nm over the surface of the sample, modulating it to deliver varying intensities of light. This process is known as direct write laser annealing (DWLA), and it makes it possible to heat areas of the sample that measure just 150 nm across.

+ +

In each heated area, thermal energy from the laser is deposited at the surface and partially absorbed by the film down to a depth of around 100 nm. The remainder dissipates through a silicon substrate coated in 300-nm-thick silicon oxide. However, the thermal conductivity of this substrate is low, which maximizes the temperature increase in the film for a given laser fluence. The researchers also sought to keep the temperature increase as uniform as possible by using thin-film heterostructures with a total thickness of less than 20 nm.&#13;

+

Crystallization and interdiffusion effects

+

Members of the PSI-ETH Zürich team applied this technique to several technologically important magnetic thin-film systems, including ferromagnetic CoFeB/MgO, ferrimagnetic CoGd and synthetic antiferromagnets composed of Co/Cr, Co/Ta or CoFeB/Pt/Ru. They found that DWLA induces both crystallization and interdiffusion effects in these materials. During crystallization, the orientation of the sample’s magnetic moments gradually changes, while interdiffusion alters the magnetic exchange coupling between the layers of the structures.

+

The researchers say that both phenomena could have interesting applications. The magnetized regions in the structures could be used in data storage, for example, with the direction of the magnetization (“up” or “down”) corresponding to the “1” or “0” of a bit of data. In conventional data-storage systems, these bits are switched with a magnetic field, but team member Jeffrey Brock explains that the new technique allows electric currents to be used instead. This is advantageous because electric currents are easier to produce than magnetic fields, while data storage devices switched with electricity are both faster and capable of packing more data into a given space.

+

Team member Lauren Riddiford says the new work builds on previous studies by members of the same group, which showed it was possible to make devices suitable for computer memory by locally patterning magnetic properties. “The trick we used here was to locally oxidize the topmost layer in a magnetic multilayer,” she explains. “However, we found that this works only in a few systems and only produces abrupt changes in the material properties. We were therefore brainstorming possible alternative methods to create gradual, smooth gradients in material properties, which would open possibilities to even more exciting applications, and realized that we could perform local annealing with a laser originally made for patterning polymer resist layers for photolithography.”&#13;

+ +

Riddiford adds that the method proved so fast and simple to implement that the team’s main challenge was to investigate all the material changes it produced. Physical characterization methods for ultrathin films can be slow and difficult, she tells Physics World.

+

The researchers, who describe their technique in Nature Communications, now hope to use it to develop structures that are compatible with current chip-manufacturing technology. “Beyond magnetism, our approach can be used to locally modify the properties of any material that undergoes changes when heated, so we hope researchers using thin films for many different devices – electronic, superconducting, optical, microfluidic and so on – could use this technique to design desired functionalities,” Riddiford says. “We are looking forward to seeing where this method will be implemented next, whether in magnetic or non-magnetic materials, and what kind of applications it might bring.”

+

The post Shining laser light on a material produces subtle changes in its magnetic properties appeared first on Physics World.

+]]>
+ Research update +New use for photolithography could have applications for data storage +https://physicsworld.com/wp-content/uploads/2026/01/magnetic-landscapes.jpg +newsletter
+ The obscure physics theory that helped Chinese science emerge from the shadows https://physicsworld.com/a/the-obscure-physics-theory-that-helped-chinese-science-emerge-from-the-shadows/ Wed, 21 Jan 2026 11:00:52 +0000 - - + + https://physicsworld.com/?p=125852 Robert P Crease reveals the curious twist in the development of Chinese physics in the 1960s

The post The obscure physics theory that helped Chinese science emerge from the shadows appeared first on Physics World.

@@ -92,7 +356,7 @@ xmlns:rawvoice="https://blubrry.com/developer/rawvoice-rss/" Opinion and reviews Robert P Crease reveals the curious twist in the development of Chinese physics in the 1960s https://physicsworld.com/wp-content/uploads/2026/01/mao-straton-pic-lighter.jpg -
+newsletter A surprising critical state emerges in active nematic materials https://physicsworld.com/a/a-surprising-critical-state-emerges-in-active-nematic-materials/ @@ -164,7 +428,18 @@ xmlns:rawvoice="https://blubrry.com/developer/rawvoice-rss/" Holtkamp will oversee the completion of the $1.5bn Long-Baseline Neutrino Facility-Deep Underground Neutrino Experiment

The post Physicist Norbert Holtkamp takes over as head of Fermilab appeared first on Physics World.

]]>
- Particle physicist Norbert Holtkamp has been appointed the new director of Fermi National Accelerator Laboratory. He took up the position on 12 January, replacing Young-Kee Kim from the University of Chicago, who held the job on an interim basis following the resignation of Lia Merminga last year.

+ Norbert Holtkamp +

Particle physicist Norbert Holtkamp has been appointed the new director of Fermi National Accelerator Laboratory. He took up the position on 12 January, replacing Young-Kee Kim from the University of Chicago, who held the job on an interim basis following the resignation of Lia Merminga last year.

With a PhD in physics from the Technical University of Darmstadt, Germany, Holtkamp has managed large scientific projects throughout his career.&#13;

Holtkamp is the former deputy director of the SLAC National Accelerator Laboratory at Stanford University, where he managed the construction of the Linac Coherent Light Source upgrade, the world’s most powerful X-ray laser, along with more than $2bn of onsite construction projects.&#13;

Holtkamp also previously served as the principal deputy director general for the international fusion project ITER, which is currently under construction in Cadarache, France.

@@ -182,7 +457,7 @@ xmlns:rawvoice="https://blubrry.com/developer/rawvoice-rss/" ]]>
News Holtkamp will oversee the completion of the $1.5bn Long-Baseline Neutrino Facility-Deep Underground Neutrino Experiment -https://physicsworld.com/wp-content/uploads/2026/01/fermilab-new-director-holtkamp.jpg +https://physicsworld.com/wp-content/uploads/2026/01/holtkamp-lists.png
CERN accepts $1bn in private cash towards Future Circular Collider @@ -196,28 +471,28 @@ xmlns:rawvoice="https://blubrry.com/developer/rawvoice-rss/"

The post CERN accepts $1bn in private cash towards Future Circular Collider appeared first on Physics World.

]]> The CERN particle-physics lab near Geneva has received $1bn from private donors towards the construction of the Future Circular Collider (FCC). The cash marks the first time in the lab’s 72-year history that individuals and philanthropic foundations have agreed to support a major CERN project. If built, the FCC would be the successor to the Large Hadron Collider (LHC), where the Higgs boson was discovered.

-

CERN originally released a four-volume conceptual design report for the FCC in early 2019, with more detail included in a three-volume feasibility study that came out last year. It calls for a giant tunnel some 90.7 km in circumference – roughly three times as long as the LHC  – that would be built about 200m underground on average.

+

CERN originally released a four-volume conceptual design report for the FCC in early 2019, with more detail included in a three-volume feasibility study that came out last year. It calls for a giant tunnel some 90.7 km in circumference – roughly three times as long as the LHC – that would be built about 200 m underground on average.&#13;

The FCC has been recommended as the preferred option for the next flagship collider at CERN in the ongoing process to update the European Strategy for Particle Physics, which will be passed over to the CERN Council in May 2026. If the plans are given the green light by the CERN Council in 2028, construction on the FCC electron-positron machine, dubbed FCC-ee, would begin in 2030. It would start operations in 2047, a few years after the High Luminosity LHC (HL-LHC) closes down, and run for about 15 years until the early 2060s.&#13;

The FCC-ee would focus on creating a million Higgs particles in total, allowing physicists to study the boson’s properties with an accuracy an order of magnitude better than is possible with the LHC. The FCC feasibility study then calls for a hadron machine, dubbed FCC-hh, to replace the FCC-ee in the existing 91 km tunnel. It would be a “discovery machine”, smashing together protons at high energy – about 85 TeV – with the aim of creating new particles. If built, the FCC-hh would begin operation in 2073 and run to the end of the century.&#13;

The funding model for the FCC-ee, which is expected to have a price tag of about $18bn, is still a work in progress. But it is estimated that at least two-thirds of the construction costs will come from CERN’s 24 member states, with the rest needing to be found elsewhere. One option to plug that gap is private donations, and in late December CERN received a significant boost from several organizations including the Breakthrough Prize Foundation, the Eric and Wendy Schmidt Fund for Strategic Innovation, and the entrepreneurs John Elkann and Xavier Niel. Together, they pledged a total of $1bn towards the FCC-ee.&#13;

-

Costas Fountas, president of the CERN Council says CERN is “extremely grateful” for the interest. “This once again demonstrates CERN’s relevance and positive impact on society, and the strong interest in CERN’s future that exists well beyond our own particle physics community,” he notes.

+

Costas Fountas, president of the CERN Council, says CERN is “extremely grateful” for the interest. “This once again demonstrates CERN’s relevance and positive impact on society, and the strong interest in CERN’s future that exists well beyond our own particle physics community,” he notes.

Eric Schmidt, the former chief executive of Google, says that he and Wendy Schmidt were “inspired by the ambition of this project and by what it could mean for the future of humanity”. The FCC, he believes, is an instrument that “could push the boundaries of human knowledge and deepen our understanding of the fundamental laws of the Universe” and could lead to technologies that could benefit society “in profound ways”, from medicine to computing to sustainable energy.&#13;

-

The cash promised has been welcomed by outgoing CERN director-general Fabiola Gianotti. “It’s the first time in history that private donors wish to partner with CERN to build an extraordinary research instrument that will allow humanity to take major steps forward in our understanding of fundamental physics and the universe,” she said. “I am profoundly grateful to them for their generosity, vision, and unwavering commitment to knowledge and exploration.

+

The cash promised has been welcomed by outgoing CERN director-general Fabiola Gianotti. “It’s the first time in history that private donors wish to partner with CERN to build an extraordinary research instrument that will allow humanity to take major steps forward in our understanding of fundamental physics and the universe,” she said. “I am profoundly grateful to them for their generosity, vision, and unwavering commitment to knowledge and exploration.”

Further boost

-

The cash comes a few months after the Circular Electron–Positron Collider (CEPC) – a rival collider to the FCC-ee that also involves building a huge 100 km tunnel to study the Higgs in unprecedented detail – was not considered for inclusion in China’s next five-year plan, which runs from 2026–2030. There has been much discussion in China whether the CEPC is the right project for the country, with the collider facing criticism from particle physicist and Nobel laureate Chen-Ning Yang, before he died last year.

+

The cash comes a few months after the Circular Electron–Positron Collider (CEPC) – a rival collider to the FCC-ee that also involves building a huge 100 km tunnel to study the Higgs in unprecedented detail – was not considered for inclusion in China’s next five-year plan, which runs from 2026 to 2030. There has been much discussion in China about whether the CEPC is the right project for the country, with the collider facing criticism from particle physicist and Nobel laureate Chen-Ning Yang, before he died last year.

Wang Yifang of the Institute of High Energy Physics (IHEP) in Beijing says the institute will submit the CEPC for consideration again in 2030 unless the FCC is officially approved before then. But for particle theorist John Ellis from King’s College London, China’s decision to effectively put the CEPC on the back burner “certainly simplifies the FCC discussion”. “However, an opportunity for growing the world particle physics community has been lost, or at least deferred [by the decision],” Ellis told Physics World.&#13;

Ellis adds, however, that he would welcome China’s participation in the FCC. “Their accelerator and detector [technical design reviews] show that they could bring a lot to the table, if the political obstacles can be overcome,” he says.

However, if the FCC-ee goes ahead China could perhaps make significant “in-kind” contributions rather like those that occur with the ITER experimental fusion reactor, which is currently being built in France. In this case, instead of cash payments, the countries provide components, equipment and other materials.

Those considerations and more will now fall to the British physicist Mark Thomson, who took over from Gianotti as CERN director-general on 1 January for a five-year term. As well as working on funding requirements for the FCC-ee, top of his in-tray will be shutting down the LHC in June to make way for further work on the HL-LHC, which involves installing powerful new superconducting magnets and improving the detectors.&#13;

-

About 90% of the 27 km LHC accelerator will be affected by the upgrade with a major part being to replace the magnets in the final focus systems of the two large experiments, ATLAS and CMS. These magnets will take the incoming beams and then focus them down to less than 10 microns in cross section. The upgrade includes the installation of brand new state-of-the-art niobium-tin (Nb3Sn) superconducting focusing magnets.

-

The HL-LHC will probably not turn on until 2030, which is when Thomson’s term will nearly be over but that doesn’t deter him from leading the world’s foremost particle-physics lab. “It’s an incredibly exciting project,” Thomson told the Guardian. “It’s more interesting than just sitting here with the machine hammering away.”

+

About 90% of the 27 km LHC accelerator will be affected by the upgrade, a major part of which is replacing the magnets in the final-focus systems of the two large experiments, ATLAS and CMS. These magnets will take the incoming beams and focus them down to less than 10 µm in cross section. The upgrade includes the installation of new state-of-the-art niobium-tin (Nb3Sn) superconducting focusing magnets.&#13;

+

The HL-LHC will probably not turn on until 2030, at which time Thomson’s term will nearly be over, but that doesn’t deter him from leading the world’s foremost particle-physics lab. “It’s an incredibly exciting project,” Thomson told the Guardian. “It’s more interesting than just sitting here with the machine hammering away.”

The post CERN accepts $1bn in private cash towards Future Circular Collider appeared first on Physics World.

]]>
Analysis Cash comes as Mark Thomson takes the reins at CERN https://physicsworld.com/wp-content/uploads/2026/01/cern-19-01-2026.jpg -
+newsletter Polarization-sensitive photoacoustic microscopy reveals heart tissue health https://physicsworld.com/a/polarization-sensitive-photoacoustic-microscopy-reveals-heart-tissue-health/ @@ -229,7 +504,7 @@ xmlns:rawvoice="https://blubrry.com/developer/rawvoice-rss/" Label-free imaging technique can distinguish diseased cardiac tissue from healthy tissue and identify different types of fibrosis

The post Polarization-sensitive photoacoustic microscopy reveals heart tissue health appeared first on Physics World.

]]>
- MIR-DS-PAM images of fibrotic and normal cardiac tissue + MIR-DS-PAM images of fibrotic and normal cardiac tissue

Many of the tissues in the human body rely upon highly organized microstructures to function effectively. If the collagen fibres in heart muscle become disordered, for instance, this can lead to or reflect disorders such as fibrosis and cancer. To image and analyse such structural changes, researchers at Pohang University of Science and Technology (POSTECH) in Korea have developed a new label-free microscopy technique and demonstrated its use in engineered heart tissue.

The ability to assess the alignment of microstructures such as protein fibres within tissue’s extracellular matrix provides a valuable tool for diagnosing disease, monitoring therapy response and evaluating tissue engineering models. Currently, however, this is achieved using histological imaging methods based on immunofluorescent staining, which can be labour-intensive and sensitive to the imaging conditions and antibodies used.

Instead, a team headed up by Chulhong Kim and Jinah Jang is investigating photoacoustic microscopy (PAM), a label-free imaging modality that relies on light absorption by endogenous tissue chromophores to reveal structural and functional information. In particular, PAM with mid-infrared (MIR) incident light provides bond-selective, high-contrast imaging of proteins, lipids and carbohydrates. The researchers also incorporated dichroism-sensitive (DS) functionality, resulting in a technique referred to as MIR-DS-PAM.

@@ -254,7 +529,7 @@ xmlns:rawvoice="https://blubrry.com/developer/rawvoice-rss/" Research update Label-free imaging technique can distinguish diseased cardiac tissue from healthy tissue and identify different types of fibrosis https://physicsworld.com/wp-content/uploads/2026/01/19-01-26-photoacoustic-microscopy-fig4-featured.jpg -
+newsletter Astronomer Daniel Jaffe named president of the Giant Magellan Telescope project https://physicsworld.com/a/astronomer-daniel-jaffe-named-president-of-the-giant-magellan-telescope-project/ @@ -266,11 +541,12 @@ xmlns:rawvoice="https://blubrry.com/developer/rawvoice-rss/" Jaffe will be aiming to secure the funding necessary to complete the $2.5bn telescope

The post Astronomer Daniel Jaffe named president of the Giant Magellan Telescope project appeared first on Physics World.

]]>
- Astronomer Daniel Jaffe has been appointed the next president of the Giant Magellan Telescope Corporation –  the international consortium building the $2.5bn Giant Magellan Telescope (GMT). He succeeds Robert Shelton, who announced his retirement last year after eight years in the role.

-

A former head of astronomy at the University of Texas at Austin from 2011 to 2015, Jaffe became vice president for research at the university from 2016 to 2025 where he also served as interim provost from 2020 to 2021.

+ Daniel Jaffe +

Astronomer Daniel Jaffe has been appointed the next president of the Giant Magellan Telescope Corporation –  the international consortium building the $2.5bn Giant Magellan Telescope (GMT). He succeeds Robert Shelton, who announced his retirement last year after eight years in the role.

+

A former head of astronomy at the University of Texas at Austin from 2011 to 2015, Jaffe became vice-president for research at the university from 2016 to 2025 and he also served as interim provost from 2020 to 2021.

Jaffe has sat on the board of directors of the Association of Universities for Research in Astronomy and the Gemini Observatory and played a role in establishing the University of Texas at Austin’s partnership in the GMT.

-

Under construction in Chile and expected to be complete in the 2030s, the GMT consists of seven mirrors to create a 25.4 m telescope. From the ground it will produce images 4-16 times sharper than the James Webb Space Telescope and will investigate the origins of the chemical elements, and search for signs of life on distant planets.

+

Under construction in Chile and expected to be complete in the 2030s, the GMT consists of seven mirrors to create a 25.4 m telescope. From the ground it will produce images 4–16 times sharper than the James Webb Space Telescope and will investigate the origins of the chemical elements, and search for signs of life on distant planets.

“I am honoured to lead the GMT at this exciting stage,” notes Jaffe. “[It] represents a profound leap in our ability to explore the universe and employ a host of new technologies to make fundamental discoveries.”

“[Jaffe] brings decades of leadership in research, astronomy instrumentation, public-private partnerships, and academia,” noted Taft Armandroff, board chair of the GMTO Corporation. “His deep understanding of the Giant Magellan Telescope, combined with his experience leading large research enterprises and cultivating a collaborative environment, make him exceptionally well suited to lead the observatory through its next phase of construction and toward operations.”

Jaffe joins the GMT at a pivotal time, as it aims to secure the funding necessary to complete the telescope with just over $1bn from private funds having been pledges so far. The collaboration recently added Northwestern University and the Massachusetts Institute of Technology to its international consortium taking the number of members to 16 universities and research institutions.

@@ -281,8 +557,8 @@ xmlns:rawvoice="https://blubrry.com/developer/rawvoice-rss/" ]]>
News Jaffe will be aiming to secure the funding necessary to complete the $2.5bn telescope -https://physicsworld.com/wp-content/uploads/2026/01/daniel-jaffe-16-01-2025.jpg -
+https://physicsworld.com/wp-content/uploads/2026/01/daniel-jaffe-list.jpg +newsletter India turns to small modular nuclear reactors to meet climate targets https://physicsworld.com/a/india-turns-to-small-modular-nuclear-reactors-to-meet-climate-targets/ @@ -312,7 +588,7 @@ xmlns:rawvoice="https://blubrry.com/developer/rawvoice-rss/" Analysis While SMRs could help meet climate targets there are concerns over their commercial viability   https://physicsworld.com/wp-content/uploads/2026/01/nuclear-plant-belgium-1007906419-shutterstock-engel-ac.jpg - +newsletter Gravitational lensing sheds new light on Hubble constant controversy https://physicsworld.com/a/gravitational-lensing-sheds-new-light-on-hubble-constant-controversy/ @@ -344,7 +620,7 @@ xmlns:rawvoice="https://blubrry.com/developer/rawvoice-rss/" Research update Astronomers calculate new value for the universe's expansion https://physicsworld.com/wp-content/uploads/2026/01/gravitational-lenses.jpg - +newsletter RFID-tagged drug capsule lets doctors know when it has been swallowed https://physicsworld.com/a/rfid-tagged-drug-capsule-lets-doctors-know-when-it-has-been-swallowed/ @@ -360,7 +636,7 @@ xmlns:rawvoice="https://blubrry.com/developer/rawvoice-rss/"

A team at Massachusetts Institute of Technology (MIT) has come up with a solution: a drug capsule containing an RFID tag that uses radiofrequency (RF) signals to communicate that it has been swallowed, and then bioresorbs into the body.

“Medication non-adherence remains a major cause of preventable morbidity and cost, but existing ingestible tracking systems rely on non-degradable electronics,” explains project leader Giovanni Traverso. “Our motivation was to create a passive, battery-free adherence sensor that confirms ingestion while fully biodegrading, avoiding long-term safety and environmental concerns associated with persistent electronic devices.”

The device – named SAFARI (smart adherence via Faraday cage and resorbable ingestible) – incorporates an RFID tag with a zinc foil RF antenna and an RF chip, as well as the drug payload, inside an ingestible gelatin capsule. The capsule is coated with a mixture of cellulose and molybdenum particles, which blocks the transit of any RF signals.

-
SAFARI capsules with and without RF-blocking coating
+
SAFARI capsules with and without RF-blocking coating

Once swallowed, however, this shielding layer breaks down in the stomach. The RFID tag (which can be preprogrammed with information such as dose metadata, manufacturing details and unique ID) can then be wirelessly queried by an external reader and return a signal from inside the body confirming that the medication has been ingested.

The capsule itself dissolves upon exposure to digestive fluids, releasing the desired medication; the metal antenna components also dissolve completely in the stomach. The use of biodegradable materials is key, as it eliminates the need for device retrieval and minimizes the risk of gastrointestinal (GI) blockage. The tiny (0.16 mm²) RFID chip remains intact and should safely leave the body through the GI tract.&#13;

Traverso suggests that the first clinical applications for the SAFARI capsule will likely be high-risk settings in which objective ingestion confirmation is particularly valuable. “[This includes] tuberculosis, HIV, transplant immunosuppression or cardiovascular therapies, where missed doses can have serious clinical consequences,” he tells Physics World.

@@ -509,7 +785,7 @@ xmlns:rawvoice="https://blubrry.com/developer/rawvoice-rss/"

Here are some of the product innovations on show at this year’s event.

Enabling high-performance photonics assembly with SmarAct

As photonics applications increasingly require systems with high complexity and integration density, manufacturers face a common challenge: how to assemble, align and test optical components with nanometre precision – quickly, reliably and at scale. At Photonics West, SmarAct presents a comprehensive technology portfolio addressing exactly these demands, spanning optical assembly, fast photonics alignment, precision motion and advanced metrology.

-
SmarAct’s photonics assembly portfolio
+
SmarAct’s photonics assembly portfolio

A central highlight is SmarAct’s Optical Assembly Solution, presented together with a preview of a powerful new software platform planned for release in late-Q1 2026. This software tool is designed to provide exceptional flexibility for implementing automation routines and process workflows into user-specific control applications, laying the foundation for scalable and future-proof photonics solutions.

For high-throughput applications, SmarAct showcases its Fast Photonics Alignment capabilities. By combining high-dynamic motion systems with real-time feedback and controller-based algorithms, SmarAct enables rapid scanning and active alignment of PICs and optical components such as fibres, fibre array units, lenses, beam splitters and more. These solutions significantly reduce alignment time while maintaining sub-micrometre accuracy, making them ideal for demanding photonics packaging and assembly tasks.

Both the Optical Assembly Solution and Fast Photonics Alignment are powered by SmarAct’s electromagnetic (EM) positioning axes, which form the dynamic backbone of these systems. The direct-drive EM axes combine high speed, high force and exceptional long-term durability, enabling fast scanning, smooth motion and stable positioning even under demanding duty cycles. Their vibration-free operation and robustness make them ideally suited for high-throughput optical assembly and alignment tasks in both laboratory and industrial environments.

@@ -1802,8 +2078,7 @@ ZAP-X represents the second cranial radiosurgery revolution, setting new standar https://physicsworld.com/?p=125024 - Join the audience for a live webinar at 3 p.m. GMT/10 a.m. EST on 21 January 2026

-

Discover the role of impedance analysis in advancing battery-model development

+ Discover the role of impedance analysis in advancing battery-model development

The post Physics-based battery model parameterization from impedance data appeared first on Physics World.

]]>
The post Physics-based battery model parameterization from impedance data appeared first on Physics World.

]]>
Webinar -Join the audience for a live webinar at 3 p.m. GMT/10 a.m. EST on 21 January 2026 - -Discover the role of impedance analysis in advancing battery-model development +Discover the role of impedance analysis in advancing battery-model development https://physicsworld.com/wp-content/uploads/2025/11/2026-01-ecs-wb-feature-image.jpg
@@ -5406,343 +5679,5 @@ Discover the role of impedance analysis in advancing battery-model developmentTrade-off between precision and entropy production lies in measurement process https://physicsworld.com/wp-content/uploads/2025/11/11-11-2025-quantum-clock-ticks_web.png - - The forgotten pioneers of computational physics - https://physicsworld.com/a/the-forgotten-pioneers-of-computational-physics/ - Tue, 11 Nov 2025 10:00:50 +0000 - - - - - https://physicsworld.com/?p=124570 - Iulia Georgescu highlights the forgotten pioneers of computational physics and calls for a wider appreciation of research software engineers

-

The post The forgotten pioneers of computational physics appeared first on Physics World.

-]]>
- When you look back at the early days of computing, some familiar names pop up, including John von Neumann, Nicholas Metropolis and Richard Feynman. But they were not lonely pioneers – they were part of a much larger group, using mechanical and then electronic computers to do calculations that had never been possible before.

-

These people, many of whom were women, were the first scientific programmers and computational scientists. Skilled in the complicated operation of early computing devices, they often had degrees in maths or science, and were an integral part of research efforts. And yet, their fundamental contributions are mostly forgotten.

-

This was in part because of their gender – it was an age when sexism was rife, and it was standard for women to be fired from their job after getting married. However, there is another important factor that is often overlooked, even in today’s scientific community – people in technical roles are often underappreciated and underacknowledged, even though they are the ones who make research possible.

-

Human and mechanical computers

-

Originally, a “computer” was a human being who did calculations by hand or with the help of a mechanical calculator. It is thought that the world’s first computational lab was set up in 1937 at Columbia University. But it wasn’t until the Second World War that the demand for computation really exploded; with the need for artillery calculations, new technologies and code breaking.

-
Three women in a basement lab performing calculations by hand
-

In the US, the development of the atomic bomb during the Manhattan Project (established in 1943) required huge computational efforts, so it wasn’t long before the New Mexico site had a hand-computing group. Called the T-5 group of the Theoretical Division, it initially consisted of about 20 people. Most were women, including the spouses of other scientific staff. Among them was Mary Frankel, a mathematician married to physicist Stan Frankel; mathematician Augusta “Mici” Teller who was married to Edward Teller, the “father of the hydrogen bomb”; and Jean Bacher, the wife of physicist Robert Bacher.

-

As the war continued, the T-5 group expanded to include civilian recruits from the nearby towns and members of the Women’s Army Corps. Its staff worked around the clock, using printed mathematical tables and desk calculators in four-hour shifts – but that was not enough to keep up with the computational needs for bomb development. In the early spring of 1944, IBM punch-card machines were brought in to supplement the human power. They became so effective that the machines were soon being used for all large calculations, 24 hours a day, in three shifts.

-

The computational group continued to grow, and among the new recruits were Naomi Livesay and Eleonor Ewing. Livesay held an advanced degree in mathematics and had done a course in operating and programming IBM electric calculating machines, making her an ideal candidate for the T-5 division. She in turn recruited Ewing, a fellow mathematician who was a former colleague. The two young women supervised the running of the IBM machines around the clock.

-

The frantic pace of the T-5 group continued until the end of the war in September 1945. The development of the atomic bomb required an immense computational effort, which was made possible through hand and punch-card calculations.

-

Electronic computers

-

Shortly after the war ended, the first fully electronic, general-purpose computer – the Electronic Numerical Integrator and Computer (ENIAC) – became operational at the University of Pennsylvania, following two years of development. The project had been led by physicist John Mauchly and electrical engineer J Presper Eckert. The machine was operated and coded by six women – mathematicians Betty Jean Jennings (later Bartik); Kathleen, or Kay, McNulty (later Mauchly, then Antonelli); Frances Bilas (Spence); Marlyn Wescoff (Meltzer) and Ruth Lichterman (Teitelbaum); as well as Betty Snyder (Holberton) who had studied journalism.

-
Two women adjusting switches on a large room-sized computer
-

Polymath John von Neumann also got involved when looking for more computing power for projects at the new Los Alamos Laboratory, established in New Mexico in 1947. In fact, although the ENIAC had originally been designed to solve ballistic trajectory problems, the first problem run on it was “the Los Alamos problem” – a thermonuclear feasibility calculation for Teller’s group studying the H-bomb.

-

Like in the Manhattan Project, several husband-and-wife teams worked on the ENIAC, the most famous being von Neumann and his wife Klara Dán, and mathematicians Adele and Herman Goldstine. Dán von Neumann in particular worked closely with Nicholas Metropolis, who alongside mathematician Stanislaw Ulam had coined the term Monte Carlo to describe numerical methods based on random sampling. Indeed, between 1948 and 1949 Dán von Neumann and Metropolis ran the first series of Monte Carlo simulations on an electronic computer.
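The term describes numerical methods that estimate a quantity by averaging over random samples. A toy illustration in this spirit (not one of the original Los Alamos calculations) is the classic estimate of π from random points in a square:

```python
import random

def estimate_pi(n_samples, seed=0):
    """Estimate pi by sampling random points in the unit square and
    counting the fraction that land inside the quarter circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n_samples

pi_estimate = estimate_pi(100_000)  # approaches pi as n_samples grows
```

The statistical error shrinks as the inverse square root of the number of samples – the property that made Monte Carlo methods so well suited to the early electronic computers.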

-

Work began on a new machine at Los Alamos in 1948 – the Mathematical Analyzer Numerical Integrator and Automatic Computer (MANIAC) – which ran its first large-scale hydrodynamic calculation in March 1952. Many of its users were physicists, and its operators and coders included mathematicians Mary Tsingou (later Tsingou-Menzel), Marjorie Jones (Devaney) and Elaine Felix (Alei); plus Verna Ellingson (later Gardiner) and Lois Cook (Leurgans).

-

Early algorithms

-

The Los Alamos scientists tried all sorts of problems on the MANIAC, including a chess-playing program – the first documented case of a machine defeating a human at the game. However, two of these projects stand out because they had profound implications for computational science.

-

In 1953 the Tellers, together with Metropolis and physicists Arianna and Marshall Rosenbluth, published the seminal article “Equation of state calculations by fast computing machines” (J. Chem. Phys. 21 1087). The work introduced the ideas behind the “Metropolis algorithm” (later renamed the Metropolis–Hastings algorithm), a Monte Carlo method based on the concept of “importance sampling”. (While Metropolis was involved in the development of Monte Carlo methods, it appears that he did not contribute directly to the article, but provided access to the MANIAC nightshift.) This is the progenitor of the Markov chain Monte Carlo methods that are widely used today throughout science and engineering.

-

Marshall later recalled how the research came about when he and Arianna had proposed using the MANIAC to study how solids melt (AIP Conf. Proc. 690 22).

-
Black and white photo of two men looking at a chess board on a table in front of large rack of computer switches
-

Edward Teller meanwhile had the idea of using statistical mechanics and taking ensemble averages instead of following detailed kinematics for each individual disk, and Mici helped with programming during the initial stages. However, the Rosenbluths did most of the work, with Arianna translating and programming the concepts into an algorithm.

-

The 1953 article is remarkable, not only because it led to the Metropolis algorithm, but also as one of the earliest examples of using a digital computer to simulate a physical system. The main innovation of this work was in developing “importance sampling”. Instead of sampling from random configurations, it samples with a bias toward physically important configurations which contribute more towards the integral.
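The core of the method fits in a few lines. The following is a minimal modern sketch, applied to a toy one-dimensional potential rather than the hard-sphere system of the 1953 paper: propose a random move, then accept it with probability min(1, exp(−βΔE)), so that states are visited with weight proportional to their Boltzmann factor.

```python
import math
import random

def metropolis_sample(energy, x0, n_steps, beta=1.0, step=2.0, seed=1):
    """Minimal Metropolis sampler: propose a random displacement and
    accept it with probability min(1, exp(-beta * dE)), so that states
    are sampled in proportion to exp(-beta * E) (importance sampling)."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step, step)
        dE = energy(x_new) - energy(x)
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            x = x_new  # accept the move; otherwise keep the old state
        samples.append(x)
    return samples

# Harmonic potential E(x) = x^2/2 at beta = 1: the sampled <x^2>
# should approach the exact thermal average of 1
samples = metropolis_sample(lambda x: 0.5 * x * x, 0.0, 50_000)
mean_x2 = sum(s * s for s in samples) / len(samples)
```

Because only energy *differences* enter the acceptance rule, the intractable normalizing constant of the Boltzmann distribution never needs to be computed – the insight that makes the method so widely applicable.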

-

Furthermore, the article also introduced another computational trick, known as “periodic boundary conditions” (PBCs): a set of conditions which are often used to approximate an infinitely large system by using a small part known as a “unit cell”. Both importance sampling and PBCs went on to become workhorse methods in computational physics.
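In simulation codes, PBCs usually amount to two small helpers – a generic modern sketch, not code from the original paper: wrapping coordinates back into the unit cell, and the “minimum image” rule for measuring separations between particles.

```python
def wrap(x, box):
    """Map a coordinate back into the unit cell [0, box)."""
    return x % box

def min_image(dx, box):
    """Minimum-image convention: the shortest separation between two
    particles once all periodic copies of the cell are considered."""
    return dx - box * round(dx / box)

box = 10.0
wrapped = wrap(12.5, box)            # a particle leaving one side re-enters the other
near = min_image(9.0 - 1.0, box)     # particles near opposite walls are close neighbours
```

With these two rules, a small unit cell behaves like a sample embedded in an infinite medium, suppressing the surface effects that would otherwise dominate a system of only a few dozen particles.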

-

In the summer of 1953, physicist Enrico Fermi, Ulam, Tsingou and physicist John Pasta also made a significant breakthrough using the MANIAC. They ran a “numerical experiment” as part of a series meant to illustrate possible uses of electronic computers in studying various physical phenomena.

-

The team modelled a 1D chain of oscillators with a small nonlinearity to see if it would behave as hypothesized, reaching an equilibrium with the energy redistributed equally across the modes (doi.org/10.2172/4376203). However, their work showed that this was not guaranteed for small perturbations – a non-trivial and non-intuitive observation that would not have been apparent without the simulations. It is the first example of a physics discovery made not by theoretical or experimental means, but through a computational approach. It would later lead to the discovery of solitons and integrable models, the development of chaos theory, and a deeper understanding of ergodic limits.
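The kind of numerical experiment they ran can be sketched in a few lines. This is a modern reconstruction (velocity-Verlet integration of the FPUT “alpha” chain), not the original MANIAC code:

```python
import math

def fput_step(u, v, dt, alpha=0.25):
    """One velocity-Verlet step for the FPUT alpha-chain with fixed ends:
    each interior oscillator feels its linear spring forces plus a small
    quadratic nonlinearity of strength alpha."""
    n = len(u)

    def accel(u):
        a = [0.0] * n  # fixed ends: zero acceleration at i = 0 and i = n-1
        for i in range(1, n - 1):
            dr = u[i + 1] - u[i]
            dl = u[i] - u[i - 1]
            a[i] = (dr - dl) + alpha * (dr * dr - dl * dl)
        return a

    a = accel(u)
    u = [ui + vi * dt + 0.5 * ai * dt * dt for ui, vi, ai in zip(u, v, a)]
    a_new = accel(u)
    v = [vi + 0.5 * (ai + bi) * dt for vi, ai, bi in zip(v, a, a_new)]
    return u, v

# Start with all the energy in the lowest mode, as in the 1953 experiment
n = 32
u = [math.sin(math.pi * i / (n - 1)) for i in range(n)]
v = [0.0] * n
for _ in range(1000):
    u, v = fput_step(u, v, 0.05)
```

Tracking the energy in each normal mode over long runs is what revealed the surprise: instead of spreading evenly, the energy kept returning almost entirely to the initial mode.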

-

Although the paper says the work was done by all four scientists, Tsingou’s role was forgotten, and the results became known as the Fermi–Pasta–Ulam problem. It was not until 2008, when French physicist Thierry Dauxois advocated for giving her credit in a Physics Today article, that Tsingou’s contribution was properly acknowledged. Today the finding is called the Fermi–Pasta–Ulam–Tsingou problem.

-

The year 1953 also saw IBM’s first commercial, fully electronic computer – an IBM 701 – arrive at Los Alamos. Soon the theoretical division had two of these machines, which, alongside the MANIAC, gave the scientists unprecedented computing power. Among those to take advantage of the new devices were Martha Evans (about whom very little is known) and theoretical physicist Francis Harlow, who began to tackle the largely unexplored subject of computational fluid dynamics.

-

The idea was to use a mesh of cells through which the fluid, represented as particles, would move. This computational method made it possible to solve complex hydrodynamics problems (involving large distortions and compressions of the fluid) in 2D and 3D. Indeed, the method proved so effective that it became a standard tool in plasma physics where it has been applied to every conceivable topic from astrophysical plasmas to fusion energy.
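The deposition step of such a scheme can be sketched as follows – a minimal modern “cloud-in-cell” weighting in one dimension, not code from the original Evans–Harlow report: each particle spreads its contribution linearly over the two nearest mesh points.

```python
def deposit(positions, n_cells, box):
    """Particle-in-cell deposition: spread each particle's weight onto
    the two nearest mesh points (linear, cloud-in-cell weighting) to
    build a density field on the mesh."""
    density = [0.0] * n_cells
    dx = box / n_cells
    for x in positions:
        cell = int(x / dx) % n_cells
        frac = x / dx - int(x / dx)          # fractional position within the cell
        density[cell] += (1.0 - frac) / dx   # share to the left mesh point
        density[(cell + 1) % n_cells] += frac / dx  # share to the right (periodic)
    return density

# Four particles in a periodic box of length 1 resolved by 4 cells
rho = deposit([0.1, 0.35, 0.6, 0.85], 4, 1.0)
```

Field equations are then solved on the mesh and the resulting forces interpolated back to the particles – the alternation between particle and mesh views that gives the method its name.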

-

The resulting internal Los Alamos report – The Particle-in-cell Method for Hydrodynamic Calculations, published in 1955 – showed Evans as first author and acknowledged eight people (including Evans) for the machine calculations. However, while Harlow is remembered as one of the pioneers of computational fluid dynamics, Evans was forgotten.

-

A clear-cut division of labour?

-

In an age when women had very limited access to the front lines of research, the computational war effort brought many female researchers and technical staff into the field. As their contributions come to light, it becomes increasingly clear that their role was not a simple clerical one.

-
Three black and white photos of people operating a large room-sized computer
-

There is a view that the coders’ work was “the vital link between the physicist’s concepts (about which the coders more often than not didn’t have a clue) and their translation into a set of instructions that the computer was able to perform, in a language about which, more often than not, the physicists didn’t have a clue either”, as physicists Giovanni Battimelli and Giovanni Ciccotti wrote in 2018 (Eur. Phys. J. H 43 303). But the examples we have seen show that some of the coders had a solid grasp of the physics, and some of the physicists had a good understanding of the machine operation. Rather than a separation between skilled and unskilled work, or between men and women, the division of labour was blurred. Indeed, it was more of an effective collaboration between physicists, mathematicians and engineers.

-

Even in the early days of the T-5 division before electronic computers existed, Livesay and Ewing, for example, attended maths lectures from von Neumann, and introduced him to punch-card operations. As has been documented in books including Their Day in the Sun by Ruth Howes and Caroline Herzenberg, they also took part in the weekly colloquia held by J Robert Oppenheimer and other project leaders. This shows they should not be dismissed as mere human calculators and machine operators who supposedly “didn’t have a clue” about physics.

-

Verna Ellingson (Gardiner) is another forgotten coder who worked at Los Alamos. While little information about her can be found, she appears as the last author on a 1955 paper (Science 122 465) written with Metropolis and physicist Joseph Hoffman – “Study of tumor cell populations by Monte Carlo methods”. The next year she was first author of “On certain sequences of integers defined by sieves” with mathematical physicist Roger Lazarus, Metropolis and Ulam (Mathematics Magazine 29 117). She also worked with physicist George Gamow on attempts to discover the code for DNA selection of amino acids, which shows the breadth of projects she was involved in.

-

Evans not only worked with Harlow but took part in a 1959 conference on self-organizing systems, where she queried AI pioneer Frank Rosenblatt on his ideas about human and machine learning. Her attendance at such a meeting, in an age when women were not common attendees, implies we should not view her as “just a coder”.

-

With their many and wide-ranging contributions, it is more than likely that Evans, Gardiner, Tsingou and many others were full-fledged researchers, and were perhaps even the first computational scientists. “These women were doing work that modern computational physicists in the [Los Alamos] lab’s XCP [Weapons Computational Physics] Division do,” says Nicholas Lewis, a historian at Los Alamos. “They needed a deep understanding of both the physics being studied, and of how to map the problem to the particular architecture of the machine being used.”

-

An evolving identity

-

Black and white photo of a woman using equipment to punch a program onto paper tape
-

-

In the 1950s there was no computational physics or computer science as such, so it is unsurprising that the practitioners of these disciplines went by different names, and that their identity has evolved over the decades since.

-

1930s–1940s

-

Originally a “computer” was a person doing calculations by hand or with the help of a mechanical calculator.

-

Late 1940s – early 1950s

-

A “coder” was a person who translated mathematical concepts into a set of instructions in machine language. John von Neumann and Herman Goldstine distinguished between “coding” and “planning”, with the former being the lower-level work of turning flow diagrams into machine language (and doing the physical configuration) while the latter did the mathematical analysis of the problem.

-

Meanwhile, an “operator” would physically handle the computer (replacing punch cards, doing the rewiring, etc). In the late 1940s, coders were also operators.

-

As historians note in the book ENIAC in Action, this was an age when “It was hard to devise the mathematical treatment without a good knowledge of the processes of mechanical computation…It was also hard to operate the ENIAC without understanding something about the mathematical task it was undertaking.”

-

For the ENIAC a “programmer” was not a person but “a unit combining different sequences in a coherent computation”. The term would later shift and eventually overlap with the meaning of coder as a person’s job.

-

1960s

-

Computer scientist Margaret Hamilton, who led the development of the on-board flight software for NASA’s Apollo program, coined the term “software engineering” to distinguish the practice of designing, developing, testing and maintaining software from the engineering tasks associated with the hardware.

-

1980s – early 2000s

-

Use of the term “programmer” for someone who coded computers peaked in the 1980s, but by the 2000s it had given way to other job titles, such as various flavours of “developer” or “software architect”.

-

Early 2010s

-

A “research software engineer” is a person who combines professional software engineering expertise with an intimate understanding of scientific research.

-

-

-

Overlooked then, overlooked now

-

Credited or not, these pioneering women and their contributions have been mostly forgotten, and only in recent decades have their roles come to light again. But why were they obscured by history in the first place?

-

Secrecy and sexism seem to be the main factors at play. For example, Livesay was not allowed to pursue a PhD in mathematics because she was a woman, and in the cases of the many married couples, the team contributions were attributed exclusively to the husband. The existence of the Manhattan Project was publicly announced in 1945, but documents that contain certain nuclear-weapons-related information remain classified today. Because these are likely to remain secret, we will never know the full extent of these pioneers’ contributions.

-

But another often overlooked reason is the widespread underappreciation of the key role of computational scientists and research software engineers, a term that was only coined just over a decade ago. Even today, these non-traditional research roles end up being undervalued. A 2022 survey by the UK Software Sustainability Institute, for example, showed that only 59% of research software engineers were named as authors, with barely a quarter (24%) mentioned in the acknowledgements or main text, while a sixth (16%) were not mentioned at all.

-

The separation between those who understand the physics and those who write the code and who understand and operate the hardware goes back to the early days of computing (see box above), but it wasn’t entirely accurate even then. People who implement complex scientific computations are not just coders or skilled operators of supercomputers, but truly multidisciplinary scientists with a deep understanding of the scientific problems, mathematics, computational methods and hardware.

-

Such people – whatever their gender – play a key role in advancing science and yet remain the unsung heroes of the discoveries their work enables. Perhaps what this story of the forgotten pioneers of computational physics tells us is that some views rooted in the 1950s are still influencing us today. It’s high time we moved on.

-

The post The forgotten pioneers of computational physics appeared first on Physics World.

-]]>
- Feature -Iulia Georgescu highlights the forgotten pioneers of computational physics and calls for a wider appreciation of research software engineers -https://physicsworld.com/wp-content/uploads/2025/11/2025-11-georgescu-maniac-frontis.jpg -newsletter1
- - Classical gravity may entangle matter, new study claims - https://physicsworld.com/a/classical-gravity-may-entangle-matter-new-study-claims/ - Tue, 11 Nov 2025 08:30:02 +0000 - - - - - https://physicsworld.com/?p=124940 - Surprising result could guide searches for quantum gravity

-

The post Classical gravity may entangle matter, new study claims appeared first on Physics World.

-]]>
- Gravity might be able to quantum-entangle particles even if the gravitational field itself is classical. That is the conclusion of a new study by Joseph Aziz and Richard Howl at Royal Holloway University of London, which challenges the popular view that such entanglement would necessarily imply that gravity must be quantized. The result could prove important in the ongoing attempt to develop a theory of quantum gravity that unites quantum mechanics with Einstein’s general theory of relativity.

- -

“When you try to quantize the gravitational interaction in exactly the same way we tried to mathematically quantize the other forces, you end up with mathematically inconsistent results – you end up with infinities in your calculations that you can’t do anything about,” Howl tells Physics World.

-

“With the other interactions, we quantized them assuming they live within an independent background of classical space and time,” Howl explains. “But with quantum gravity, arguably you cannot do this [because] gravity describes space−time itself rather than something within space−time.”

-

Quantum entanglement occurs when two particles share linked quantum states even when separated. While it has become a powerful probe of the gravitational field, the central question is whether gravity can mediate entanglement only if it is itself quantum in nature.

-

General treatment

-

“It has generally been considered that the gravitational interaction can only entangle matter if the gravitational field is quantum,” Howl says. “We have argued that you could treat the gravitational interaction as more general than just the mediation of the gravitational field such that even if the field is classical, you could in principle entangle matter.”

-

Quantum field theory postulates that entanglement between masses arises through the exchange of virtual gravitons. These are hypothetical, transient quantum excitations of the gravitational field. Aziz and Howl propose that even if the field remains classical, virtual-matter processes can still generate entanglement indirectly. These processes, Howl says, “will persist even when the gravitational field is considered classical and could in principle allow for entanglement”.

-

The idea of probing the quantum nature of gravity through entanglement goes back to a suggestion by Richard Feynman in the 1950s. He envisioned placing a tiny mass in a superposition of two locations and checking whether its gravitational field was also superposed. Though elegant, the idea seemed untestable at the time.

-

Recent proposals − most notably by teams led by Sougato Bose and by Chiara Marletto and Vlatko Vedral – revived Feynman’s insight in a more practical form.

-

Feasible tests

-

“Recently, two proposals showed that one way you could test that the field is in a superposition (and thus quantum) is by putting two masses in a quantum superposition of two locations and seeing if they become entangled through the gravitational interaction,” says Howl. “This also seemed to be much more feasible than Feynman’s original idea.” Such experiments might use levitated diamonds, metallic spheres, or cold atoms – systems where both position and gravitational effects can be precisely controlled.

-

Aziz and Howl’s work, however, considers whether such entanglement could arise even if gravity is not quantum. They find that certain classical-gravity processes can in principle entangle particles, though the predicted effects are extremely small.

-

“These classical-gravity entangling effects are likely to be very small in near-future experiments,” Howl says. “This though is actually a good thing: it means that if we see entanglement…we can be confident that this means that gravity is quantized.”

-

The paper has drawn a strong response from some leading figures in the field, including Marletto at the University of Oxford, who co-developed the original idea of using gravitationally induced entanglement as a test of quantum gravity.

- -

“The phenomenon of gravitationally induced entanglement … is a game changer in the search for quantum gravity, as it provides a way to detect quantum effects in the gravitational field indirectly, with laboratory-scale equipment,” she says. Detecting it would, she adds, “constitute the first experimental confirmation that gravity is quantum, and the first experimental refutation of Einstein’s relativity as an adequate theory of gravity”.

-

However, Marletto disputes Aziz and Howl’s interpretation. “No classical theory of gravity can mediate entanglement via local means, contrary to what the study purports to show,” she says. “What the study actually shows is that a classical theory with direct, non-local interactions between the quantum probes can get them entangled.” In her view, that mechanism “is not new and has been known for a long time”.

-

Despite the controversy, Howl and Marletto agree that experiments capable of detecting gravitationally induced entanglement would be transformative. “We see our work as strengthening the case for these proposed experiments,” Howl says. Marletto concurs that “detecting gravitationally induced entanglement will be a major milestone … and I hope and expect it will happen within the next decade.”

-

Howl hopes the work will encourage further discussion about quantum gravity. “It may also lead to more work on what other ways you could argue that classical gravity can lead to entanglement,” he says.

-

The research is described in Nature.

-

The post Classical gravity may entangle matter, new study claims appeared first on Physics World.

-]]>
- Research update -Surprising result could guide searches for quantum gravity -https://physicsworld.com/wp-content/uploads/2025/11/10-11-25-quantum-gravity.jpg -newsletter1
- - Is Donald Trump conducting a ‘blitzkrieg’ on science? - https://physicsworld.com/a/is-donald-trump-conducting-a-blitzkrieg-on-science/ - Mon, 10 Nov 2025 15:00:06 +0000 - - - - - https://physicsworld.com/?p=124724 - The US High Energy Physics Advisory Panel has been dissolved for reasons of politics, not efficiency, says Robert P Crease

-

The post Is Donald Trump conducting a ‘blitzkrieg’ on science? appeared first on Physics World.

-]]>
- “Drain the swamp!”

-

In the intense first few months of his second US presidency, Donald Trump has been enacting his old campaign promise with a vengeance. He’s ridding all the muck from the American federal bureaucracy, he claims, and finally bringing it back under control.

-

Scientific projects and institutions are particular targets of his, with one recent casualty being the High Energy Physics Advisory Panel (HEPAP). Outsiders might shrug their shoulders at a panel of scientists being axed. Panels come and go. Also, any development in Washington these days is accompanied by confusion, uncertainty, and the possibility of reversal.

-

But HEPAP’s dissolution is different. Set up in 1967, it’s been a valuable and long-standing advisory committee of the Office of Science at the US Department of Energy (DOE). HEPAP has a distinguished track record of developing, supporting and reviewing high-energy physics programmes, setting priorities and balancing different areas. Many scientists are horrified by its axing.

-

The terminator

-

Since taking office in January 2025, Trump has issued a flurry of executive orders – presidential decrees that do not need Congressional approval, legislative review or public debate. One order, which he signed in February, was entitled “Commencing the Reduction of the Federal Bureaucracy”.

- -

It sought to reduce parts of the government “that the President has determined are unnecessary”, seeking to eliminate “waste and abuse, reduce inflation, and promote American freedom and innovation”. While supporters see those as laudable goals, opponents believe the order is driving a stake into the heart of US science.

-

Hugely valuable, long-standing scientific advisory committees have been axed at key federal agencies, including NASA, the National Science Foundation, the Environmental Protection Agency, the National Oceanic and Atmospheric Administration, the US Geological Survey, the National Institutes of Health, the Food and Drug Administration, and the Centers for Disease Control and Prevention.

-

What’s more, the committees were terminated without warning or debate, eliminating load-bearing pillars of the US science infrastructure. It was, as the Columbia University sociologist Gil Eyal put it in a recent talk, the “Trump 2.0 Blitzkrieg”.

-

Then, on 30 September, Trump’s enablers took aim at advisory committees at the DOE Office of Science. According to the DOE’s website, a new Office of Science Advisory Committee (SCAC) will take over functions of the six former discretionary (non-legislatively mandated) Office of Science advisory committees.

-

“Any current charged responsibilities of these former committees will be transferred to the SCAC,” the website states matter-of-factly. The committee will provide “independent, consensus advice regarding complex scientific and technical issues” to the entire Office of Science. Its members will be appointed by under secretary for science Dario Gil – a political appointee.

-

Apart from HEPAP, others axed without warning were the Nuclear Science Advisory Committee, the Basic Energy Sciences Advisory Committee, the Fusion Energy Sciences Advisory Committee, the Advanced Scientific Computing Advisory Committee, and the Biological and Environmental Research Advisory Committee.

-

Over the years, each committee served a different community and was made up of prominent research scientists who were closely in touch with other researchers. Each committee could therefore assemble the awareness of – and technical knowledge about – emerging promising initiatives and identify the less promising ones.

-

Many committee members only learned of the changes when they received letters or e-mails out of the blue informing them that their committee had been dissolved, that a new committee had replaced them, and that they were not on it. No explanation was given.

-


-
-

Physicists whom I have spoken to are appalled for two main reasons. One is that closing HEPAP and the other Office of Science committees will hamper both the technical support and community input that it has relied on to promote the efficient, effective and robust growth of physics.

-

“Speaking just for high-energy physics, HEPAP gave feedback on the DOE and NSF funding strategies and priorities for the high-energy physics experiments,” says Kay Kinoshita from the University of Cincinnati, a former HEPAP member. “The panel system provided a conduit for information between the agencies and the community, so the community felt heard and the agencies were (mostly) aligned with the community consensus”.

-

As Kinoshita continued: “There are complex questions that each panel has to deal with, even within the topical area. It’s hard to see how a broader panel is going to make better strategic decisions, ‘better’ meaning in terms of scientific advancement. In terms of community buy-in I expect it will be worse.”

-

Other physicists cite a second reason for alarm. The elimination of the advisory committees spreads the expertise so thinly as to increase the likelihood of political pressure on decisions. “If you have one committee you are not going to get the right kind of fine detail,” says Michael Lubell, a physicist and science-policy expert at the City College of New York, who has sat in on meetings of most of the Office of Science advisory committees.

-

“You’ll get opinions from people outside that area and you won’t be able to get information that you need as a policy maker to decide how the resources are to be allocated,” he adds. “A condensed-matter physicist for example, would probably have insufficient knowledge to advise DOE on particle physics. Instead, new committee members would be expected to vet programs based on ideological conformity to what the Administration wants.”

-

The critical point

-

At the end of the Second World War, the US began to construct an ambitious long-range plan to promote science, beginning with the establishment of the National Science Foundation in 1950, that has been developed and extended ever since. The plan aimed to incorporate both the ability of elected politicians to direct science towards social needs and the independence of scientists to explore what is possible.

- -

US presidents have, of course, had pet scientific projects: the War on Cancer (Nixon), the Moon Shot (Kennedy), promoting renewable energy (Carter), to mention a few. But it is one thing for a president to set science to producing a socially desirable product and another to manipulate the scientific process itself.

-

“This is another sad day for American science,” says Lubell. “If I were a young person just embarking on a career, I would get the hell out of the country. I would not want to waste the most creative years of my life waiting for things to turn around, if they ever do. What a way to destroy a legacy!”

-

The end of HEPAP is not draining a swamp but creating one.

-

The post Is Donald Trump conducting a ‘blitzkrieg’ on science? appeared first on Physics World.

-]]>
- Opinion and reviews -The US High Energy Physics Advisory Panel has been dissolved for reasons of politics, not efficiency, says Robert P Crease -https://physicsworld.com/wp-content/uploads/2025/11/2025-11-cp-hepap-panel-axed.jpg -newsletter
- - Delft Circuits, Bluefors: the engine-room driving joined-up quantum innovation - https://physicsworld.com/a/delft-circuits-bluefors-the-engine-room-driving-joined-up-quantum-innovation/ - Mon, 10 Nov 2025 09:48:06 +0000 - - - - - https://physicsworld.com/?p=124864 - Technology partners will focus on scalable cryogenic I/O cabling assemblies for next-generation quantum computing systems

-

The post Delft Circuits, Bluefors: the engine-room driving joined-up quantum innovation appeared first on Physics World.

-]]>
- delft-circuits-cri/oflex cabling technology -

Better together. That’s the headline take on a newly inked technology partnership between Bluefors, a heavyweight Finnish supplier of cryogenic measurement systems, and Delft Circuits, a Dutch manufacturer of specialist I/O cabling solutions designed for the scale-up and industrial deployment of next-generation quantum computers.

-

The drivers behind the tie-up are clear: as quantum systems evolve – think vastly increased qubit counts plus ever-more exacting requirements on gate fidelity – developers in research and industry will reach a point where current coax cabling technology doesn’t cut it anymore. The answer? Collaboration, joined-up thinking and product innovation.

-

In short, by integrating Delft Circuits’ Cri/oFlex® cabling technology into Bluefors’ dilution refrigerators, the vendors’ combined customer base will benefit from a complete, industrially proven and fully scalable I/O solution for their quantum systems. The end-game: to overcome the quantum tech industry’s biggest bottleneck, forging a development pathway from quantum computing systems with hundreds of qubits today to tens of thousands of qubits by 2030.

-

Joined-up thinking

-

For context, Cri/oFlex® cryogenic RF cables comprise a stripline (a type of transmission line) based on planar microwave circuitry – essentially a conducting strip encapsulated in dielectric material and sandwiched between two conducting ground planes. The use of the polyimide Kapton® as the dielectric ensures Cri/oFlex® cables remain flexible in cryogenic environments (which are necessary to generate quantum states, manipulate them and read them out), with silver or superconducting NbTi providing the conductive strip and ground layer. The standard product comes as a multichannel flex (eight channels per flex) with a range of I/O channel configurations tailored to the customer’s application needs, including flux bias lines, microwave drive lines, signal lines or read-out lines.

Robby Ferdinandus of Delft Circuits

“Reliability is a given with Cri/oFlex®,” says Robby Ferdinandus, global chief commercial officer for Delft Circuits and a driving force behind the partnership with Bluefors. “By integrating components such as attenuators and filters directly into the flex,” he adds, “we eliminate extra parts and reduce points of failure. Combined with fast thermalization at every temperature stage, our technology ensures stable performance across thousands of channels, unmatched by any other I/O solution.”


Technology aside, the new partnership is informed by a “one-stop shop” mindset, offering the high-density Cri/oFlex® solution pre-installed and fully tested in Bluefors cryogenic measurement systems. For the end-user, think turnkey efficiency: streamlined installation, commissioning, acceptance and, ultimately, enhanced system uptime.


Scalability is front-and-centre too, thanks to Delft Circuits’ pre-assembled and tested side-loading systems. The high-density I/O cabling solution delivers up to 50% more channels per side-loading port than Bluefors’ current High Density Wiring, providing a total of 1536 input or control lines to an XLDsl cryostat. In addition, more wiring lines can be added to multiple KF ports as a custom option.
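
As a back-of-the-envelope illustration of why I/O density matters for scale-up, the sketch below compares how many qubits a fixed cryogenic line budget can serve. The 1536-line figure comes from the text (and the 1024-line baseline is implied by the "up to 50% more" comparison); the lines-per-qubit ratios are hypothetical assumptions, not vendor specifications.

```python
# Rough cryogenic I/O budget check (illustrative only).
# Line totals from the article; lines-per-qubit ratios are hypothetical
# assumptions for a generic superconducting processor.

def qubits_supported(total_lines: int, lines_per_qubit: float) -> int:
    """Upper bound on the number of qubits a given I/O budget can serve."""
    return int(total_lines // lines_per_qubit)

hdw_lines = 1024       # implied baseline: 1536 is "up to 50% more"
crioflex_lines = 1536  # stated total for an XLDsl cryostat

for lpq in (3.0, 2.0, 1.0):  # hypothetical control/readout lines per qubit
    print(lpq, qubits_supported(hdw_lines, lpq),
          qubits_supported(crioflex_lines, lpq))
```

Whatever the exact ratio, the qubit count a cryostat can host scales linearly with the line count, which is why channel density is the headline metric here.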


Doubling up for growth

Reetta Kaila of Bluefors

Reciprocally, there’s significant commercial upside to this partnership. Bluefors is the quantum industry’s leading cryogenic systems OEM and, by extension, Delft Circuits now has access to the former’s established global customer base, amplifying its channels to market by orders of magnitude. “We have stepped into the big league here and, working together, we will ensure that Cri/oFlex® becomes a core enabling technology on the journey to quantum advantage,” notes Ferdinandus.


That view is amplified by Reetta Kaila, director for global technical sales and new products at Bluefors (and, alongside Ferdinandus, a main-mover behind the partnership). “Our market position in cryogenics is strong, so we have the ‘muscle’ and specialist know-how to integrate innovative technologies like Cri/oFlex® into our dilution refrigerators,” she explains.


A win-win, it seems, along several coordinates. “The Bluefors sales teams are excited to add Cri/oFlex® into the product portfolio,” Kaila adds. “It’s worth noting, though, that the collaboration extends across multiple functions – technical and commercial – and will therefore ensure close alignment of our respective innovation roadmaps.”


Scalable I/O will accelerate quantum innovation


Deconstructed, Delft Circuits’ value proposition is all about enabling, from an I/O perspective, the transition of quantum technologies out of the R&D lab into at-scale practical applications. More specifically: Cri/oFlex® technology allows quantum scientists and engineers to increase the I/O cabling density of their systems easily – and by a lot – while guaranteeing high gate fidelities (minimizing noise and heating) as well as market-leading uptime and reliability.


To put some hard-and-fast performance milestones against that claim, the company has published a granular product development roadmap that aligns Cri/oFlex® cabling specifications with the anticipated evolution of quantum computing systems – from 150+ qubits today out to 40,000 qubits and beyond in 2029 (see figure below, “Quantum alignment”).


The resulting milestones are based on a study of the development roadmaps of more than 10 full-stack quantum computing vendors – a consolidated view that will ensure the “guiding principles” of Delft Circuits’ innovation roadmap align with the aggregate quantity and quality of qubits targeted by the system developers over time.


Delft Circuits' product development roadmap (“Quantum alignment”)

The post Delft Circuits, Bluefors: the engine-room driving joined-up quantum innovation appeared first on Physics World.

Analysis | Technology partners will focus on scalable cryogenic I/O cabling assemblies for next-generation quantum computing systems
https://physicsworld.com/wp-content/uploads/2025/11/2025-11-delft-na-cables.jpg
Microbubbles power soft, programmable artificial muscles
https://physicsworld.com/a/microbubbles-power-soft-programmable-artificial-muscles/
Mon, 10 Nov 2025 09:30:18 +0000
Ultrasound-activated microbubble arrays create flexible actuators for applications ranging from soft robotics to minimally invasive surgery


The post Microbubbles power soft, programmable artificial muscles appeared first on Physics World.

Ultrasound-powered soft surgical robot

Artificial muscles that offer flexible functionality could prove invaluable for a range of applications, from soft robotics and wearables to biomedical instrumentation and minimally invasive surgery. Current designs, however, are limited by complex actuation mechanisms and challenges in miniaturization. Aiming to overcome these obstacles, a research team headed up at the Acoustic Robotics Systems Lab (ETH Zürich) in Switzerland is using microbubbles to create soft, programmable artificial muscles that can be wirelessly controlled via targeted ultrasound activation.


Gas-filled microbubbles can concentrate acoustic energy, providing a means to initiate movement with rapid response times and high spatial accuracy. In this study, reported in Nature, team leader Daniel Ahmed and colleagues built a synthetic muscle from a thin flexible membrane containing arrays of more than 10,000 microbubbles. When acoustically activated, the microbubbles generate thrust and cause the membrane to deform. And as different sized microbubbles resonate at different ultrasound frequencies, the arrays can be designed to provide programmable motion.


“Ultrasound is safe, non-invasive, can penetrate deep into the body and can generate large forces. However, without microbubbles, a much higher force is needed to deform the muscle, and selective activation is difficult,” Ahmed explains. “To overcome this limitation, we use microbubbles, which amplify force generation at specific sites and act as resonant systems. As a result, we can activate the artificial muscle at safe ultrasound power levels and generate complex motion.”


The team created the artificial muscles from a thin silicone membrane patterned with an array of cylindrical microcavities with the dimensions of the desired microbubbles. Submerging this membrane in a water-filled acoustic chamber trapped tens of thousands of gas bubbles within the cavities (one per cavity). The final device contains around 3000 microbubbles per mm2 and weighs just 0.047 mg/mm2.


To demonstrate acoustic activation, the researchers fabricated an artificial muscle containing uniform-sized microbubbles on one surface. They fixed one end of the muscle and exposed it to resonant frequency ultrasound, simultaneously exciting the entire microbubble array. The resulting oscillations generated acoustic streaming and radiation forces, causing the muscle to flex upward, with an amplitude dependent upon the ultrasound excitation voltage.


Next, the team designed an 80 µm-thick, 3 x 0.5 cm artificial muscle containing arrays of three different sized microbubbles. Stimulation at 96.5, 82.3 and 33.2 kHz induced deformations in regions containing bubbles with diameters of 12, 16 and 66 µm, respectively. Exposure to swept-frequency ultrasound covering the three resonant frequencies sequentially activated the different arrays, resulting in an undulatory motion.
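
The selective, swept-frequency activation described above can be illustrated with a toy resonance model – not the authors' model – in which each bubble array behaves as a driven damped oscillator with a Lorentzian amplitude response. The resonant frequencies are those reported in the article; the quality factor is a hypothetical assumption.

```python
import math

# Toy model of frequency-selective activation (not the authors' model):
# each microbubble array is treated as a driven damped oscillator with a
# Lorentzian steady-state amplitude. Resonant frequencies are those
# reported in the article; the quality factor Q is hypothetical.

ARRAYS = {12: 96.5e3, 16: 82.3e3, 66: 33.2e3}  # diameter (um) -> f0 (Hz)
Q = 10.0  # hypothetical quality factor

def response(f_drive: float, f0: float, q: float = Q) -> float:
    """Relative amplitude of an oscillator with resonance f0 driven at f_drive."""
    gamma = f0 / q  # damping bandwidth
    return 1.0 / math.sqrt((f0**2 - f_drive**2)**2 + (gamma * f_drive)**2)

def most_responsive(f_drive: float) -> int:
    """Bubble diameter (um) whose array responds most strongly at f_drive."""
    return max(ARRAYS, key=lambda d: response(f_drive, ARRAYS[d]))

# sweeping through the three resonances activates the arrays in turn
for f in (33.2e3, 82.3e3, 96.5e3):
    print(int(f), most_responsive(f))
```

Sweeping the drive frequency through the three resonances therefore picks out one array at a time, which is the mechanism behind the undulatory motion described in the text.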

Microbubble-array artificial muscles

A multitude of functions


Ahmed and colleagues showcased a range of applications for the artificial muscle by integrating microbubble arrays into functional devices, such as a miniature soft gripper for trapping and manipulating fragile live animals. The gripper comprises six to ten microbubble-array-based “tentacles” that, when subjected to ultrasound, gently gripped a zebrafish larva with a sub-100 ms response time. When the ultrasound was switched off, the tentacles opened and the larva swam away with no adverse effects.


The artificial muscle can function as a conformable robotic skin that sticks to a stationary object and imparts motion to it, which the team demonstrated by attaching it to the surface of an excised pig heart. It can also be employed for targeted drug delivery – shown by the use of a microbubble-array robotic patch for ultrasound-enhanced delivery of dye into an agar block.


The researchers also built an ultrasound-powered “stingraybot”, a soft surgical robot with artificial muscles (arrays of differently sized microbubbles) on either side to mimic the pectoral fins of a stingray. Exposure to swept-frequency ultrasound induced an undulatory motion that wirelessly propelled the 4 cm-long robot forward at a speed of about 0.8 body lengths per second.


To demonstrate future practical biomedical applications, such as supporting minimally invasive surgery or site-specific drug release within the gastrointestinal tract, the researchers encapsulated a rolled-up stingraybot within a 27 x 12 mm edible capsule. Once released into the stomach, the robot could be propelled on demand under ultrasound actuation. They also pre-folded a linear artificial muscle into a wheel shape and showed that swept ultrasound frequencies could propel it along the complex mucosal surfaces of the stomach and intestine.


“Through the strategic use of microbubble configurations and voltage and frequency as ultrasound excitation parameters, we engineered a diverse range of preprogrammed movements and demonstrated their applicability across various robotic platforms,” the researchers write. “Looking ahead, these artificial muscles hold transformative potential across cutting-edge fields such as soft robotics, haptic medical devices and minimally invasive surgery.”


Ahmed says that the team is currently developing soft patches that can conform to biological surfaces for drug delivery inside the bladder. “We are also designing soft, flexible robots that can wrap around a tumour and release drugs directly at the target site,” he tells Physics World. “Basically, we’re creating mobile conformable drug-delivery patches.”



Research update
https://physicsworld.com/wp-content/uploads/2025/11/10-11-25-stingraybot-featured2.jpg

China's Shenzhou-20 crewed spacecraft return delayed by space debris impact
https://physicsworld.com/a/chinas-shenzhou-20-crewed-spacecraft-return-delayed-by-space-debris-impact/
Fri, 07 Nov 2025 15:00:02 +0000
Fears that the craft has been struck by a small piece of debris


The post China’s Shenzhou-20 crewed spacecraft return delayed by space debris impact appeared first on Physics World.

China has delayed the return of a crewed mission to the country’s space station over fears that the astronauts’ spacecraft has been struck by space debris. The craft was supposed to return to Earth on 5 November but the China Manned Space Agency says it will now carry out an impact analysis and risk assessment before making any further decisions about when the astronauts will return.


The Shenzhou programme ferries astronauts to and from China’s Tiangong space station, which was completed in 2022, for six-month stays.


Shenzhou-20, carrying three crew, launched on 24 April from the Jiuquan Satellite Launch Center on board a Long March 2F rocket. Once docked with Tiangong, the three-member crew of Shenzhou-19 began handing over control of the station to the crew of Shenzhou-20 before returning to Earth on 30 April.


The three-member crew of Shenzhou-21 launched on 31 October and carried out the same handover process with the crew of Shenzhou-20, who were then set to return to Earth on Wednesday.


Yet pre-return checks revealed that the craft had been hit by “a small piece of debris”, although the location and scale of the damage to Shenzhou-20 have not been released.


If the craft is deemed unsafe following the assessment, it is possible that the crew of Shenzhou-20 will return to Earth aboard Shenzhou-21. Another option is to launch a back-up Shenzhou spacecraft, which remains on stand-by and could be launched within eight days.


Space debris is of increasing concern, and this marks the first time that a crewed spacecraft’s return has been delayed by a potential debris impact. In 2021, for example, China noted that Tiangong had to perform two emergency manoeuvres to avoid fragments from Starlink satellites launched by SpaceX.



News
https://physicsworld.com/wp-content/uploads/2025/11/china-space-07-11-2025.jpg

Twistelastics controls how mechanical waves move in metamaterials
https://physicsworld.com/a/twistelastics-controls-how-mechanical-waves-move-in-metamaterials/
Fri, 07 Nov 2025 13:57:32 +0000
New technique could deliver reconfigurable phononic devices with myriad applications


The post Twistelastics controls how mechanical waves move in metamaterials appeared first on Physics World.

Twisted surfaces can be used to manipulate mechanical waves

Simply placing two identical elastic metasurfaces on top of one another and rotating them relative to each other changes the topology of the elastic waves dispersing through the stacked structure – from elliptic to hyperbolic. This new control technique, developed by physicists at the CUNY Advanced Science Research Center in the US, works over a broad frequency range and has been dubbed “twistelastics”. It could allow for advanced reconfigurable phononic devices with potential applications in microelectronics, ultrasound sensing and microfluidics.


The researchers, led by Andrea Alù, say they were inspired by the recent advances in “twistronics” and its “profound impact” on electronic and photonic systems. “Our goal in this work was to explore whether similar twist-induced topological phenomena could be harnessed in elastodynamics in which phonons (vibrations of the crystal lattice) play a central role,” says Alù.


In twistelastics, the rotations between layers of identical, elastic engineered surfaces are used to manipulate how mechanical waves travel through the materials. The new approach, say the CUNY researchers, allows them to reconfigure the behaviour of these waves and precisely control them. “This opens the door to new technologies for sensing, communication and signal processing,” says Alù.


From elliptic to hyperbolic


In their work, the researchers used computer simulations to design metasurfaces patterned with micron-sized pillars. When they stacked one such metasurface atop the other and rotated them at different angles, the resulting combined structure changed the way phonons spread. Indeed, their dispersion topology went from elliptic to hyperbolic.


At a specific rotation angle, known as the “magic angle” (just like in twistronics), the waves become highly focused and begin to travel in one direction. This effect could allow for more efficient signal processing, says Alù, with the signals being easier to control over a wide range of frequencies.
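
The elliptic-to-hyperbolic transition can be sketched with a toy isofrequency contour, w0^2 = a*kx^2 + b*ky^2 – a generic model, not the paper's actual dispersion relation. Flipping the sign of b (loosely standing in for what the twist angle does) opens the closed elliptic contour into a hyperbola, so real solutions exist for arbitrarily large kx and propagation becomes directional.

```python
import math

# Toy isofrequency contour w0^2 = a*kx^2 + b*ky^2 (illustration only,
# not the paper's dispersion). b > 0: closed ellipse, waves spread in
# all directions. b < 0: open hyperbola, real ky for every kx, so
# energy is channelled along preferred directions.

def ky_on_contour(kx: float, w0: float, a: float, b: float):
    """Return |ky| on the isofrequency contour, or None if no real solution."""
    val = (w0**2 - a * kx**2) / b
    return math.sqrt(val) if val >= 0 else None

w0, a = 1.0, 1.0
# elliptic regime: no solution beyond |kx| = w0/sqrt(a)
assert ky_on_contour(2.0, w0, a, b=+1.0) is None
# hyperbolic regime: open contour, real ky even for large kx
assert ky_on_contour(2.0, w0, a, b=-1.0) is not None
```

The directional focusing reported at the magic angle corresponds to the hyperbolic regime, where the open contour channels waves along its asymptotes.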


“The new twistelastic platform offers broadband, reconfigurable and robust control over phonon propagation,” he tells Physics World. “This may be highly useful for a wide range of application areas, including surface acoustic wave (SAW) technologies, ultrasound imaging and sensing, microfluidic particle manipulation and on-chip phononic signal processing.”


New frontiers


Since the twist-induced transitions are topologically protected, again like in twistronics, the system is resilient to fabrication imperfections, meaning it can be miniaturized and integrated into real-world devices, he adds. “We are part of an exciting science and technology centre called ‘New Frontiers of Sound’, of which I am one of the leaders. The goal of this ambitious centre is to develop new acoustic platforms for the above applications enabling disruptive advances for these technologies.”


Looking ahead, the researchers say they are looking into miniaturizing their metasurface design for integration into microelectromechanical systems (MEMS). They will also study multi-layer twistelastic architectures to improve control over wave propagation, and investigate active tuning mechanisms, such as electromechanical actuation, to dynamically control twist angles. “Adding piezoelectric phenomena for further control and coupling to the electromagnetic waves” is also on the agenda, says Alù.


The present work is detailed in PNAS.



Research update
https://physicsworld.com/wp-content/uploads/2025/11/7-11-25-twisting-layers.jpg

Ternary hydride shows signs of room-temperature superconductivity at high pressures
https://physicsworld.com/a/ternary-hydride-shows-signs-of-room-temperature-superconductivity-at-high-pressures/
Fri, 07 Nov 2025 09:00:33 +0000
New alloy is made by doping scandium into the well-known La-H binary system


The post Ternary hydride shows signs of room-temperature superconductivity at high pressures appeared first on Physics World.

Crystal lattice structure of a new high-temperature superconductor

Researchers in China claim to have made the first ever room-temperature superconductor by compressing an alloy of lanthanum-scandium (La-Sc) and the hydrogen-rich material ammonia borane (NH3BH3) together at pressures of 250–260 GPa, observing superconductivity with a maximum onset temperature of 298 K. While these high pressures are akin to those at the centre of the Earth, the work marks a milestone in the field of superconductivity, they say.


Superconductors conduct electricity without resistance and many materials do this when cooled below a certain transition temperature, Tc. In most cases this temperature is very low – for example, solid mercury, the first superconductor to be discovered, has a Tc of 4.2 K. Researchers have therefore been looking for superconductors that operate at higher temperatures – perhaps even at room temperature. Such materials could revolutionize a host of application areas, including increasing the efficiency of electrical generators and transmission lines through lossless electricity transmission. They would also greatly simplify technologies, such as MRI, that rely on the generation or detection of magnetic fields.


Researchers made considerable progress towards this goal in the 1980s and 1990s with the discovery of the “high-temperature” copper oxide superconductors, which have Tc values between 30 and 133 K. Fast-forward to 2015 and the maximum known critical temperature rose even higher thanks to the discovery of a sulphide material, H3S, that has a Tc of 203 K when compressed to pressures of 150 GPa.


This result sparked much interest in solid materials containing hydrogen atoms bonded to other elements and in 2019, the record was broken again, this time by lanthanum decahydride (LaH10), which was found to have a Tc of 250–260 K, albeit again at very high pressures. Then in 2021, researchers observed high-temperature superconductivity in the cerium hydrides, CeH9 and CeH10, which are remarkable because they are stable and boast high-temperature superconductivity at lower pressures (about 80 GPa, or 0.8 million atmospheres) than the other so-called “superhydrides”.


Ternary hydrides


In recent years, researchers have started turning their attention to ternary hydrides – substances that comprise three different atomic species rather than just two. Compared with binary hydrides, ternary hydrides are more structurally complex, which may allow them to have higher Tc values. Indeed, Li2MgH16 has been predicted to exhibit “hot” superconductivity with a Tc of 351–473 K under multimegabar pressures, and several other high-Tc hydrides, including MBxHy, MBeH8 and Mg2IrH6–7, have been predicted to be stable at comparatively lower pressures.


In the new work, a team led by physicist Yanming Ma of Jilin University studied LaSc2H24 – a compound made by doping Sc into the well-known La-H binary system. Using the crystal structure prediction (CALYPSO) method, Ma and colleagues had already predicted that this ternary material should feature hexagonal P6/mmm symmetry. Introducing Sc into La-H results in the formation of two novel interlinked H24 and H30 hydrogen clathrate “cages”, with the H24 cages surrounding Sc and the H30 cages surrounding La.


The researchers predicted that these two novel hydrogen frameworks should produce an exceptionally large hydrogen-derived density of states at the Fermi level (the highest energy level that electrons can occupy in a solid at a temperature of absolute zero), as well as enhancing coupling between electrons and phonons (vibrations of the crystal lattice) in the material, leading to an exceptionally high Tc of up to 316 K at high pressure.
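
The qualitative argument here – strong electron-phonon coupling plus a high phonon frequency scale yielding a high Tc – is captured by the standard McMillan formula as modified by Allen and Dynes. The parameter values below are hypothetical, chosen only to show the trend; they are not taken from the LaSc2H24 study.

```python
import math

# McMillan formula (Allen-Dynes modification) for conventional
# superconductors: Tc rises with the electron-phonon coupling lambda
# and the logarithmic phonon frequency scale omega_log. All parameter
# values here are hypothetical, for illustration only.

def mcmillan_tc(omega_log_K: float, lam: float, mu_star: float = 0.1) -> float:
    """Estimate Tc (kelvin) from omega_log (kelvin), coupling lam and mu*."""
    return (omega_log_K / 1.2) * math.exp(
        -1.04 * (1 + lam) / (lam - mu_star * (1 + 0.62 * lam))
    )

# stronger coupling at a fixed phonon scale raises Tc monotonically
for lam in (1.0, 2.0, 3.0):
    print(lam, round(mcmillan_tc(1200.0, lam), 1))
```

Light hydrogen atoms give superhydrides their unusually high phonon frequencies, which is why boosting the hydrogen-derived density of states and the electron-phonon coupling pushes Tc towards room temperature in this framework.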


To characterize their material, the researchers placed it in a diamond-anvil cell, a device that generates extreme pressures as it squeezes the sample between two tiny, gem-grade crystals of diamond (one of the hardest substances known) while heating it with a laser. In situ X-ray diffraction experiments revealed that the compound crystallizes into a hexagonal structure, in excellent agreement with the predicted P6/mmm LaSc2H24 structure.


A key piece of experimental evidence for superconductivity in the La-Sc-H ternary system, says co-author Guangtao Liu, came from measurements that repeatedly demonstrated the onset of zero electrical resistance below the Tc.


Another key piece of evidence, Liu adds, is that the Tc decreases monotonically with the application of an external magnetic field in a number of independently synthesized samples. “This behaviour is consistent with the conventional theory of superconductivity since an external magnetic field disrupts Cooper pairs – the charge carriers responsible for the zero-resistance state – thereby suppressing superconductivity.”
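
For illustration, the monotonic suppression Liu describes is what the conventional empirical Ginzburg-Landau form Hc2(T) = Hc2(0)[1 - (T/Tc0)^2] predicts when inverted to give Tc under an applied field. The Tc0 value matches the onset temperature quoted in the article; the Hc2(0) value is a hypothetical placeholder, not the paper's data.

```python
import math

# Empirical Ginzburg-Landau form Hc2(T) = Hc2(0) * (1 - (T/Tc0)^2),
# inverted to give the transition temperature under an applied field.
# Tc0 = 298 K matches the article's quoted onset; Hc2(0) = 200 T is a
# hypothetical placeholder for illustration only.

def tc_in_field(h_tesla: float, tc0: float = 298.0, hc2_0: float = 200.0) -> float:
    """Transition temperature (K) at applied field h_tesla."""
    if h_tesla >= hc2_0:
        return 0.0
    return tc0 * math.sqrt(1.0 - h_tesla / hc2_0)

fields = [0.0, 1.0, 5.0, 9.0]  # typical lab magnet fields, in tesla
tcs = [tc_in_field(h) for h in fields]
assert all(a > b for a, b in zip(tcs, tcs[1:]))  # Tc falls monotonically
```

Whatever the true Hc2(0), any form of this family gives a strictly decreasing Tc with field, which is the signature the team reports.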


“These two main observations demonstrate the superconductivity in our synthesized La-Sc-H compound,” he tells Physics World.


Difficult experiments


The experiments were not easy, Liu recalls. The first six months of attempting to synthesize LaSc2H24 below 200 GPa yielded no obvious Tc enhancement. “We then tried higher pressure and above 250 GPa, we had to manually deposit three precursor layers and ensure that four electrodes (for subsequent conductance measurements) were properly connected to the alloy in an extremely small sample chamber, just 10 to 15 µm in size,” he says. “This required hundreds of painstaking repetitions.”


And that was not all: to synthesize the LaSc2H24, the researchers had to prepare the correct molar ratios of a precursor alloy. The Sc and La elements cannot form a solid solution because of their different atomic radii, so using a normal melting method makes it hard to control this ratio. “After about a year of continuous investigations, we finally used the magnetron sputtering method to obtain films of LaSc2H24 with the molar ratios we wanted,” Liu explains. “During the entire process, most of our experiments failed and we ended up damaging at least 70 pairs of diamonds.”


Sven Friedemann of the University of Bristol, who was not involved in this work, says that the study is “an important step forward” for the field of superconductivity with a new record transition temperature of 295 K. “The new measurements show zero resistance (within resolution) and suppression in magnetic fields, thus strongly suggesting superconductivity,” he comments. “It will be exciting to see future work probing other signatures of superconductivity. The X-ray diffraction measurements could be more comprehensive and leave some room for uncertainty as to whether it is indeed the claimed LaSc2H24 structure giving rise to the superconductivity.”


Ma and colleagues say they will continue to study the properties of this compound – and in particular, verify the isotope effect (a signature of conventional superconductors) or measure the superconducting critical current. “We will also try to directly detect the Meissner effect – a key goal for high-temperature superhydride superconductors in general,” says Ma. “Guided by rapidly advancing theoretical predictions, we will also synthesize new multinary superhydrides to achieve better superconducting properties under much lower pressures.”


The study is available on the arXiv pre-print server.



Research update
https://physicsworld.com/wp-content/uploads/2025/11/7-11-25-room-temperature-superconductor-featured.jpg