diff --git "a/raw_rss_feeds/https___www_livescience_com_feeds_all.xml" "b/raw_rss_feeds/https___www_livescience_com_feeds_all.xml" --- "a/raw_rss_feeds/https___www_livescience_com_feeds_all.xml" +++ "b/raw_rss_feeds/https___www_livescience_com_feeds_all.xml" @@ -10,8 +10,176 @@
The patient is one of six taking part in the first clinical trial of pig-to-human kidney transplants. The goal: to see whether gene-edited pig kidneys can safely replace failing human ones.
A decade ago, scientists were chasing a different solution. Instead of editing the genes of pigs to make their organs human-friendly, they tried to grow human organs — made entirely of human cells — inside pigs. But in 2015 the National Institutes of Health paused funding for that work to consider its ethical risks. The pause remains today.
As a bioethicist and philosopher who has spent years studying the ethics of using organs grown in animals — including serving on an NIH-funded national working group examining oversight for research on human-animal chimeras — I was perplexed by the decision. The ban assumed the danger was making pigs too human. Yet regulators now seem comfortable making humans a little more pig.
Why is it considered ethical to put pig organs in humans but not to grow human organs in pigs?
It's easy to overlook the desperation driving these experiments. More than 100,000 Americans are waiting for organ transplants. Demand overwhelms supply, and thousands die each year before one becomes available.
For decades, scientists have looked across species for help — from baboon hearts in the 1960s to genetically altered pigs today. The challenge has always been the immune system. The body treats cells it does not recognize as part of itself as invaders. As a result, it destroys them.
A recent case underscores this fragility. A man in New Hampshire received a gene-edited pig kidney in January 2025. Nine months later, it had to be removed because its function was declining. While this partial success gave scientists hope, it was also a reminder that rejection remains a central problem for transplanting organs across species, also known as xenotransplantation.
Researchers are attempting to work around transplant rejection by creating an organ the human body might tolerate, inserting a few human genes and deleting some pig ones. Still, recipients of these gene-edited pig organs need powerful drugs to suppress the immune system both during and long after the transplant procedure, and even this may not prevent rejection. Even human-to-human transplants require lifelong immunosuppressants.
That's why another approach — growing organs from a patient's own cells — looked promising. This involved disabling the genes that let pig embryos form a kidney and injecting human stem cells into the embryo to fill the gap where a kidney would be. As a result, the pig embryo would grow a kidney genetically matched to a future patient, theoretically eliminating the risk of rejection.
Although simple in concept, the execution is technically complex because human and pig cells develop at different speeds. Even so, five years prior to the NIH ban, researchers had already done something similar by growing a mouse pancreas inside a rat.
Cross-species organ growth was not a fantasy — it was a working proof of concept.
The worries motivating the NIH's 2015 ban on inserting human stem cells into animal embryos came not from concerns about scientific failure but from moral confusion.
Policymakers feared that human cells might spread through the animal's body — even into its brain — and in so doing blur the line between human and animal. The NIH warned of possible "alterations of the animal's cognitive state." The Animal Legal Defense Fund, an animal advocacy organization, argued that if such chimeras gained humanlike awareness, they should be treated as human research subjects.
The worry centers on the possibility that an animal's moral status — that is, the degree to which an entity's interests matter morally and the level of protection it is owed — might change. Higher moral status requires better treatment because it comes with vulnerability to greater forms of harm.
Think of the harm caused by poking an animal that's sentient compared to the harm caused by poking an animal that's self-conscious. A sentient animal — that is, one capable of experiencing sensations such as pain or pleasure — would sense the pain and try to avoid it. In contrast, an animal that's self-conscious — that is, one capable of reflecting on having those experiences — would not only sense the pain but grasp that it is itself the subject of that pain. The latter kind of harm is deeper, involving not just sensation but awareness.
Thus, the NIH's concern is that if human cells migrate into an animal's brain, they might introduce new forms of experience and suffering, thereby elevating its moral status.

However, the reasoning behind the NIH's ban is faulty. If certain cognitive capacities, such as self-consciousness, conferred higher moral status, then it follows that regulators would be equally concerned about inserting dolphin or primate cells into pigs as they are about inserting human cells. They are not.
In practice, the moral circle of beings whose interests matter is drawn not around self-consciousness but around species membership. Regulators protect all humans from harmful research because they are human, not because of their specific cognitive capacities such as the ability to feel pain, use language or engage in abstract reasoning. In fact, many people lack such capacities. Moral concern flows from that relationship, not from having a particular form of awareness. No research goal can justify violating the most basic interests of human beings.
If a pig embryo infused with human cells truly became something close enough to count as a member of the human species, then current research regulations would dictate it's owed human-level regard. But the mere presence of human cells doesn't make pigs humans.
The pigs engineered for kidney transplants already carry human genes, but they aren't called half-human beings. When a person donates a kidney, the recipient doesn't become part of the donor's family. Yet current research policies treat a pig with a human kidney as if it might.
There may be good reasons to object to using animals as living organ factories, including welfare concerns. But the rationale behind the NIH ban that human cells could make pigs too human rests on a misunderstanding of what gives beings — and human beings in particular — moral standing.
This edited article is republished from The Conversation under a Creative Commons license. Read the original article.
]]>January's full moon, nicknamed the Wolf Moon, rises on Saturday, Jan. 3, as the second-highest full moon of the year. The moon turns full at precisely 5:03 a.m. EST and will also appear bright and full on Friday (Jan. 2) and Sunday (Jan. 4).
The full Wolf Moon is the last of four consecutive supermoons, after October's Harvest Moon, November's Beaver Moon and December's Cold Moon.
Supermoons occur when the full moon rises near perigee, its closest point to Earth in its elliptical orbit, making it appear bigger and brighter than a typical full moon. (By contrast, a micromoon occurs when the full moon coincides with apogee, its farthest point from Earth, making it appear smaller from our perspective.)
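To put rough numbers on that difference: the moon's apparent diameter scales inversely with its distance, so comparing a typical perigee with a typical apogee gives the size change directly. Below is a minimal Python sketch of that arithmetic; the distance values are approximate averages assumed for illustration, not figures from this article.

```python
# Approximate mean lunar distances, in kilometers (illustrative assumptions).
PERIGEE_KM = 363_300  # typical closest approach to Earth
APOGEE_KM = 405_500   # typical farthest point from Earth

# Apparent diameter is inversely proportional to distance, so a perigee
# full moon looks wider than an apogee one by the ratio of the distances.
size_ratio = APOGEE_KM / PERIGEE_KM
brightness_ratio = size_ratio ** 2  # brightness scales with apparent area

print(f"A perigee full moon looks ~{(size_ratio - 1) * 100:.0f}% wider")
print(f"and ~{(brightness_ratio - 1) * 100:.0f}% brighter than an apogee one.")
```

With these assumed values, the script reports a supermoon appearing roughly 12% wider and 25% brighter than a micromoon, which is why the difference is most noticeable in side-by-side photos.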
Here's how to photograph the moon when it's at its best.
In 2026, you'll have the chance to see 13 full moons, including three supermoons and two lunar eclipses (one of which is the last total lunar eclipse until New Year's Eve 2028). Although experienced moon gazers know that the night of the full moon is not the best for observing the lunar surface (even with a good pair of binoculars), the full moon rising as an orb at dusk is a celestial view that's hard to beat.
Here are all of the full moon dates and times for 2026, according to timeanddate.com, including the most commonly used names in North America:

There will be two lunar eclipses in 2026, but only one will be total. The first, on March 2-3, will be a total lunar eclipse, during which the full Worm Moon will drift through Earth's inner umbral shadow and turn a reddish-orange color for 58 minutes, from 6:04 to 7:02 a.m. EST on March 3, according to timeanddate.com. The best views of this event, nicknamed a "blood moon," will be from western North America and the Asia Pacific.
The second lunar eclipse, on Aug. 27-28, will be a partial lunar eclipse, during which 96% of the Sturgeon Moon will enter Earth's inner umbral shadow and may take on a reddish-orange hue near maximum eclipse at 12:12 a.m. EDT on Aug. 28, according to timeanddate.com. The best views will be from North and South America, Europe and Africa.

Scientists typically break the moon's 29.5-day cycle into eight phases, which are determined by the relative positions of the moon, Earth and the sun.
The start of the cycle is the new moon, which is when the moon is exactly between Earth and the sun. We cannot see the moon when it's in the new phase because no sunlight is reflected from its Earth-facing side. A new moon is the only time when a solar eclipse is possible. Two central solar eclipses will occur in 2026: an annular solar eclipse on Feb. 17 and a total solar eclipse on Aug. 12.
As more sunlight hits the moon's Earth-facing side, we say the moon is waxing. The next phase of the moon is called a waxing crescent, followed by the first-quarter phase. Half of the moon's visible surface appears illuminated during the first quarter.
Next comes the waxing gibbous moon, which is partway between a first-quarter moon and a full moon. Halfway through the lunar cycle, the full moon rises, and the moon shines bright and large in the sky. During this phase, the moon and the sun are on opposite sides of Earth, and the entire Earth-facing side of the moon is illuminated.
After the full moon, the waning cycle begins — first with the waning gibbous phase, then a last-quarter moon and, finally, a waning crescent. After almost 30 days, the moon becomes "new" again, and the cycle repeats.
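Because the cycle is so regular, the current phase can be estimated with simple arithmetic: count the days since a known new moon, divide by the 29.53-day synodic month, and map the fractional position onto the eight phases above. The Python sketch below shows the idea; the reference new moon is a commonly used epoch, and the mean cycle length makes this an approximation that can drift by several hours or more from the true phase.

```python
from datetime import datetime, timezone

SYNODIC_MONTH = 29.53059  # mean length of one lunar cycle, in days

# A commonly used reference new moon: Jan. 6, 2000, 18:14 UTC.
REFERENCE_NEW_MOON = datetime(2000, 1, 6, 18, 14, tzinfo=timezone.utc)

PHASES = [
    "new moon", "waxing crescent", "first quarter", "waxing gibbous",
    "full moon", "waning gibbous", "last quarter", "waning crescent",
]

def moon_phase(when: datetime) -> str:
    """Return the approximate phase name for a given moment."""
    days = (when - REFERENCE_NEW_MOON).total_seconds() / 86400
    position = (days % SYNODIC_MONTH) / SYNODIC_MONTH  # 0 = new, 0.5 = full
    index = int(position * 8 + 0.5) % 8  # snap to the nearest of 8 phases
    return PHASES[index]

# The January 2026 full moon: 5:03 a.m. EST on Jan. 3 is 10:03 UTC.
print(moon_phase(datetime(2026, 1, 3, 10, 3, tzinfo=timezone.utc)))  # full moon
```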
]]>Much of the discussion about the melting of massive ice sheets during a time of climate change addresses its effects on people. That makes sense: Millions will see their homes damaged or destroyed by rising sea levels and storm surges.
But what will happen to Antarctica itself as the ice sheets melt?
In layers of sediment accumulated on the sea floor over millions of years, researchers like us are finding evidence that when West Antarctica melted, there was a rapid uptick in onshore geological activity in the area. The evidence foretells what’s in store for the future.
As far back as 30 million years ago, an ice sheet covered much of what we now call Antarctica. But during the Pliocene Epoch, which lasted from 5.3 million to 2.6 million years ago, the ice sheet on West Antarctica drastically retreated. Rather than a continuous ice sheet, all that remained were high ice caps and glaciers on or near mountaintops.
About 5 million years ago, conditions around Antarctica began to warm, and West Antarctic ice diminished. About 3 million years ago, all of Earth entered a warm climate phase, similar to what is happening today.
Glaciers are not stationary. These large masses of ice form on land and flow toward the sea, moving over bedrock and scraping off material from the landscape they cover, and carrying that debris along as the ice moves, almost like a conveyor belt. This process speeds up when the climate warms, as does calving into the sea, which forms icebergs. Debris-laden icebergs can then carry that continental rock material out to sea, dropping it to the sea floor as the icebergs melt.

In early 2019, we joined a major scientific trip – International Ocean Discovery Program Expedition 379 – to the Amundsen Sea, south of the Pacific Ocean. Our expedition aimed to recover material from the seabed to learn what had happened in West Antarctica during its melting period all that time ago.
Aboard the drillship JOIDES Resolution, workers lowered a drill nearly 13,000 feet (3,962 meters) to the sea floor and then drilled 2,605 feet (794 meters) into the ocean floor, directly offshore from the most vulnerable part of the West Antarctic ice sheet.
The drill brought up long tubes called “cores,” containing layers of sediments deposited between 6 million years ago and the present. Our research focused on sections of sediment from the time of the Pliocene Epoch, when Antarctica was not entirely ice-covered.

While onboard, one of us, Christine Siddoway, was surprised to discover an uncommon sandstone pebble in a disturbed section of the core. Sandstone fragments were rare in the core, so the pebble’s origin was of high interest. Tests showed that the pebble had come from mountains deep in the Antarctic interior, roughly 800 miles (1,300 kilometers) from the drill site.
For this to have happened, icebergs must have calved from glaciers flowing off interior mountains and then floated toward the Pacific Ocean. The pebble provided evidence that a deep-water ocean passage – rather than today’s thick ice sheet – existed across the interior of what is now Antarctica.
After the expedition, once the researchers returned to their home laboratories, this finding was confirmed by analyzing silt, mud, rock fragments, and microfossils that also came up in the sediment cores. The chemical and magnetic properties of the core material revealed a detailed timeline of the ice sheet’s retreats and advances over many years.

One key sign came from analyses led by Keiji Horikawa. He tried to match thin mud layers in the core with bedrock from the continent, to test the idea that icebergs had carried such materials very long distances. Each mud layer was deposited right after a deglaciation episode in which the retreating ice sheet left behind a bed of iceberg-carried pebbly clay. By measuring the amounts of various elements, including strontium, neodymium and lead, he was able to link specific thin layers of mud in the drill cores to chemical signatures in outcrops in the Ellsworth Mountains, 870 miles (1,400 kilometers) away.
Horikawa discovered not just one instance of this material but as many as five mud layers deposited between 4.7 million and 3.3 million years ago. That suggests the ice sheet melted and open ocean formed, then the ice sheet regrew, filling the interior, repeatedly, over short spans of thousands to tens of thousands of years.
Teammate Ruthie Halberstadt combined this chemical evidence and timing in computer models showing how an archipelago of ice-capped, rugged islands emerged as ocean replaced the thick ice sheets that now fill Antarctica’s interior basins.
The biggest changes happened along the coast. The model simulations show a rapid increase in iceberg production and a dramatic retreat of the edge of the ice sheet toward the Ellsworth Mountains. The Amundsen Sea became choked with icebergs produced from all directions. Rocks and pebbles embedded in the glaciers floated out to sea within the icebergs and dropped to the seabed as the icebergs melted.
Long-standing geological evidence from Antarctica and elsewhere around the world shows that as ice melts and flows off the land, the land itself rises because the ice no longer presses it down. That shift can cause earthquakes, especially in West Antarctica, which sits above particularly hot areas of the Earth’s mantle that can rebound at high rates when the ice above them melts.
The release of pressure on the land also increases volcanic activity – as is happening in Iceland in the present day. Evidence of this in Antarctica comes from a volcanic ash layer that Siddoway and Horikawa identified in the cores, formed 3 million years ago.
The long-ago loss of ice and upward motions in West Antarctica also triggered massive rock avalanches and landslides in the fractured, damaged rock that forms glacial valley walls and coastal cliffs. Collapses beneath the sea displaced vast amounts of sediment from the marine shelf. No longer held in place by the weight of glacier ice and ocean water, huge masses of rock broke away and surged into the water, producing tsunamis that unleashed more coastal destruction.
The rapid onset of all these changes made deglaciated West Antarctica a showpiece for what has been called “catastrophic geology.”
The rapid upswell of activity resembles what has happened elsewhere on the planet in the past. For instance, at the end of the last Northern Hemisphere ice age, 15,000 to 18,000 years ago, the region between Utah and British Columbia was subjected to floods from bursting glacial meltwater lakes, land rebound, rock avalanches and increased volcanic activity. In coastal Canada and Alaska, such events continue to occur today.
Our team’s analysis of rocks’ chemical makeup makes clear that West Antarctica doesn’t necessarily undergo one gradual, massive shift from ice-covered to ice-free, but rather swings back and forth between vastly different states. Each time the ice sheet disappeared in the past, it led to geological mayhem.
The future implication for West Antarctica is that when its ice sheet next collapses, the catastrophic events will return. This will happen repeatedly, as the ice sheet retreats and advances, opening and closing the connections between different areas of the world’s oceans.
This dynamic future may bring about equally swift responses in the biosphere, such as algal blooms around icebergs in the ocean, leading to an influx of marine species into newly opened seaways. Vast tracts of land upon West Antarctic islands would then open up to growth of mossy ground cover and coastal vegetation that would turn Antarctica more green than its current icy white.
Our data about the Amundsen Sea’s past and the resulting forecast indicate that onshore changes in West Antarctica will not be slow, gradual or imperceptible from a human perspective. Rather, what happened in the past is likely to recur: geologically rapid shifts that are felt locally as apocalyptic events such as earthquakes, eruptions, landslides and tsunamis – with worldwide effects.
This edited article is republished from The Conversation under a Creative Commons license. Read the original article.
]]>We're closer than ever to achieving functional cures for once-intractable diseases, including HIV. Stem-cell treatments are repairing blinding eye damage and stabilizing failing hearts. Emerging cancer treatments promise to extend patients' lives and decrease the likelihood that their disease will return. And cutting-edge treatments are sparing children from devastating genetic diseases.
But even as the promise of decades of medical research is being realized, the foundations of the field are coming under attack. Can these emerging treatments really save us from harm when long-standing tenets of public health and medical research are being eroded before our eyes?
That's the question circling my brain as I look ahead to 2026. While I'd love to focus solely on how far we've come, it's impossible to ignore the ground we've lost in recent months.
As anticipated, 2025 was a breakthrough year for gene therapy, and I expect 2026 to bring more exciting developments in the field.
KJ Muldoon, a baby born with a rare genetic disease, became the first person to receive a customized CRISPR treatment. The two CRISPR-based therapies approved to date are one-size-fits-all, and they require cells to be removed, edited in a lab, and then reintroduced into the body. KJ's treatment, by contrast, was made to tweak a specific mutation in his cells, and the editing took place inside his body.
One of KJ's doctors told me that they're now working with the Food and Drug Administration to make these bespoke treatments easier for patients to access, so hopefully, more people will benefit from such therapies in the coming months. (Notably, though, baby KJ's treatment used mRNA — a molecule that also formed the basis of the first COVID-19 vaccines. The federal government is retreating from mRNA vaccines, but other uses of the technology may be spared.)
In the meantime, scientists are trialing a gene therapy for Huntington's disease that may slow its progression — a feat never realized with any existing treatment. A CRISPR treatment for high cholesterol is making its way through trials, as is a gene therapy for congenital deafness and a new cancer therapy that involves base editing immune cells. And in preclinical research, scientists are developing new gene-editing systems that could someday enable "mutation agnostic" treatments that work for many people, as a complement to therapies that correct very specific mutations.
This year, we also saw results from a U.K.-based clinical trial of "mitochondrial donation," a technique that's been in the works for years and is finally being tested in people. The approach, done in the context of in vitro fertilization, aims to prevent mothers who carry harmful mutations in their mitochondrial DNA from passing those mutations to their kids. In the early trial, the approach appeared to be successful, and I'll be interested to see how the research proceeds.
We've also seen GLP-1s — Ozempic and other drugs in the same class — become more commonplace, and we've been learning about their potential benefits beyond weight loss and blood-sugar control. There are early signals that these drugs may help treat migraine, alcohol use disorder and heart failure, for instance. I expect these findings will spur interesting research into the underlying relationship between these conditions and metabolism.
That said, I don't think the drugs will be a silver bullet for all diseases — they just failed in a hotly anticipated Alzheimer's disease trial, for example. Nonetheless, research on GLP-1s may uncover previously unappreciated drivers of disease that could be tackled by other means in the future.
I'll also be keen to follow emerging research on senolytics — drugs that clear senescent, or biologically aged, cells from the body.
Xenotransplantation — the transplantation of animal organs into humans — continues to progress by leaps and bounds as experiments and trials with humans unfold around the world.
And as research increasingly reveals the role of viruses in dementia, I expect the next few years of studies could fundamentally rewrite our understanding of neurodegenerative disease and how to treat it.
From a technological and research standpoint, there's a lot to be excited about. But the horizon looks darker when you cast your eyes to the realm of public health and the systems that fund and regulate research and new drugs, at least in the United States.
President Trump's second administration ushered in controversial new appointments across the country's leading health agencies — as well as deep budget cuts. Vaccine and medical-establishment skeptic Robert F. Kennedy Jr., now at the helm of the Department of Health and Human Services, spearheaded dramatic changes across its divisions, such as the National Institutes of Health (NIH) and the Centers for Disease Control and Prevention (CDC).
The NIH has signaled that it's deemphasizing the practice of studying both sexes. Given that females are understudied at baseline, experts worry that such a move will widen existing knowledge gaps. The agency's leadership has also argued that collecting demographic data — on study participants' race, ethnicity or gender identity — should be avoided except in circumstances deemed "scientifically justified," a phrase with no clear definition.
In the next year, I expect these moves to derail research aimed at understanding health disparities and improving care for marginalized and understudied populations. Disrupting this research today means prolonging these disparities in the future.
Former CDC leaders have reported witnessing a profound disconnect between RFK Jr. and the agency's scientific staff, a lack of strategy surrounding policy changes, and a dismissal of established research findings. Meanwhile, a new vaccine advisory committee handpicked by RFK Jr. has cast doubt on the well-established childhood vaccine schedule.
Some recent committee decisions have been more confusing than directly consequential, such as those regarding the measles vaccine and the COVID-19 vaccines. Still, set against the backdrop of RFK Jr. broadly undermining trust in vaccines, even these changes could lower vaccination rates in a country already poised to lose its measles elimination status. And other committee decisions, such as recommendations to delay hepatitis B vaccination for newborns, have the potential to cause direct and significant harm right away.

As the CDC is dismantled and its career scientists are ignored, devalued or fired, I anticipate further holes to be poked in the nation's public-health safety net in 2026. Some decisions may primarily stoke confusion and mistrust around established medical practice. Others may bar access to care by revoking federal insurance coverage or withholding reimbursement to hospitals that provide certain types of care.
The exact impacts of forthcoming changes will likely be piecemeal, varying from state to state, similar to how we've seen abortion access splinter in the wake of Roe v. Wade's overturn. But nationwide, it's fair to expect upticks in vaccine-preventable disease.
For trustworthy health guidance, I would recommend sources such as the American Academy of Pediatrics, the American College of Obstetricians and Gynecologists and other professional medical associations; the independent health-policy resource KFF; and the University of Minnesota's Center for Infectious Disease Research and Policy, including its Vaccine Integrity Project. Local health departments and regional coalitions, such as the West Coast Health Alliance, should also help fill the information gap left by federal agencies.
But given that the average person is already bombarded with conflicting health guidance — especially online — I'm concerned that the loss of centralized sources of science-backed information will ultimately put more people at risk of preventable disease.
The Trump administration also shuttered the U.S. Agency for International Development (USAID) this year, pushing a handful of the agency's prior functions under the Department of State.
USAID, previously the world's largest foreign aid agency, had programs aimed at combating infectious diseases like HIV and tuberculosis, reducing malnutrition, cleaning water systems, and bolstering maternal health care around the world. Its loss left governments and organizations scrambling to make up the funding shortfall, but they likely won't be able to fill the gap completely, stakeholders have warned. Even if they do, delays in funding still mean delays in care, which can be deadly.
Prior to USAID's closure, experts worldwide were cautiously optimistic about bringing an end to the HIV epidemic by 2030. Now, models suggest that the loss of the agency could lead to millions more HIV cases and deaths in low- and middle-income countries than anticipated over the next five years. Looking beyond HIV to all of USAID's former programs, estimates suggest that the closures have already contributed to hundreds of thousands of deaths from infectious diseases and malnutrition worldwide.
The U.S. is not immune to the ripple effects of USAID's dissolution.
"One of USAID's most critical functions is to fight the spread of infectious diseases that have the capacity to spark a global pandemic," Dr. Chris Beyrer, an epidemiologist and director of the Duke Global Health Institute, wrote for Live Science in March. "While much of this work is carried out far from the U.S., infectious diseases know no borders, and we have seen countless instances of viruses that arise in one part of the world but quickly find their way to other countries."
This, to me, underscores a key point about public health: It's a group project. Improving conditions for those most vulnerable to disease benefits everyone in the long run, not only by reducing suffering and saving lives but by cutting health care costs and bolstering economies. One could say the same about the efforts to curb climate change and environmental pollution — efforts that the current administration is also repudiating.
I look forward to following the development of groundbreaking medical treatments over the upcoming year. These emerging technologies promise to alleviate the suffering of individual patients — if they can access them. But even as we celebrate those accomplishments, I worry that their benefits simply won't reach a huge portion of the populace.
Headlines about the next great gene therapy will run alongside news of rising infection rates and deadlier climate-driven disasters. Early data hint that senolytics could help stave off age-related diseases — but even as those drugs get developed, falling vaccination rates mean we could return to a time when far more people died in childhood than in recent decades.
My hope for 2026 is that the scientists and stakeholders still committed to protecting public health will persevere and find ever-expanding support so that everyone can reap the benefits of medical science.
This article is for informational purposes only and is not meant to offer medical advice.
]]>Although it's not as famous as August's Perseids or December's Geminids, January's Quadrantids can be just as prolific. This year, they will be active from Dec. 28 through Jan. 12 and will peak on Jan. 3 starting around 4 p.m. EST (21:00 UTC).
During that peak, about 25 "shooting stars" are likely per hour. However, because this meteor shower coincides with the full moon this year, the sky will be relatively bright. That will make faint meteors harder to see, with around 10 per hour likely to be visible.
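That moonlight penalty can be estimated with the standard correction meteor observers apply to a shower's idealized rate: multiply by the sine of the radiant's altitude, then divide by r^(6.5 - lm), where r is the shower's population index and lm is the faintest stellar magnitude visible at the site. The Python sketch below runs those numbers; the population index of about 2.1 for the Quadrantids and the loss of roughly one magnitude of sky darkness to a full moon are assumptions for illustration, not figures from this article.

```python
import math

def visible_rate(zhr: float, pop_index: float, limiting_mag: float,
                 radiant_alt_deg: float) -> float:
    """Approximate meteors visible per hour for a single observer.

    zhr: idealized zenithal hourly rate (dark sky, radiant overhead)
    pop_index: population index r, describing the shower's brightness mix
    limiting_mag: faintest star magnitude visible at the observing site
    radiant_alt_deg: altitude of the shower's radiant above the horizon
    """
    dilution = pop_index ** (6.5 - limiting_mag)  # penalty for bright skies
    geometry = math.sin(math.radians(radiant_alt_deg))
    return zhr * geometry / dilution

# Dark sky (lm ~6.5) vs. full-moon sky (lm ~5.5), radiant high overhead:
print(round(visible_rate(25, 2.1, 6.5, 70)))  # ~23 meteors per hour
print(round(visible_rate(25, 2.1, 5.5, 70)))  # ~11 meteors per hour
```

Under these assumptions, losing a single magnitude of sky darkness roughly halves the visible count, consistent with the drop from about 25 to about 10 meteors per hour described above.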
It's a narrow peak, lasting about six hours, so North American skywatchers should start looking as soon as it gets dark. Although shooting stars from the Quadrantids tend to be relatively faint, they can also produce bright "fireballs."
Quadrantids can be seen anywhere in the night sky, but they appear to come from the northern sky — specifically, the constellation Boötes, part of which was formerly called Quadrans Muralis (hence the name of this meteor shower). The best way to visualize this radiant origin point for the Quadrantids is to look at the night sky around the handle of the famous Big Dipper asterism.
The Quadrantid meteor shower happens when, each January, Earth travels through a narrow stream of dust and debris orbiting the sun. The stream is thought to come from an object called 2003 EH1, which may be an asteroid or an extinct comet. It takes 5.5 years to orbit the sun, and its closest approach to the sun is around the same distance as Earth's orbit (but safely beyond it), according to EarthSky.
The next notable meteor shower will be the Lyrids in April. When the Lyrid meteor shower peaks on the night of April 21-22 during a crescent moon, the sky conditions will be ideal for seeing about 18 shooting stars per hour.
To maximize the number of meteors you'll see during either event, find a location with a clear view of as much of the night sky as possible. The bright moon during the Quadrantids will make escaping to dark skies largely pointless, but try to keep the moon behind you to improve your chances of spotting shooting stars.
]]>Supporters of planetary colonization argue that becoming a multi-planet species could safeguard us from potentially Earth-ending events. However, it will require an enormous effort to colonize another planet or moon. And if we look beyond Mars, potentially habitable planets may take thousands of years to reach.
But as technology advances and space agencies consider long-term human settlements on other planets, a more fundamental issue now beckons — not whether we can expand to other worlds, but whether we should.
What's your take? Answer our poll below and share the reasoning behind your choice in the comments.
—Alcohol-soaked star system could help explain 'why life, including us, was able to form'
—There may be hundreds of millions of habitable planets in the Milky Way, new study suggests
—'Eyeball' planet spied by James Webb telescope might be habitable
]]>Based on several photos of the head, a researcher spotted that the individual was born with a cleft lip, Beth Scaffidi, an assistant professor of anthropology and heritage studies at the University of California, Merced, wrote in a new study.
Cleft lip and cleft palate are related conditions in which babies are born with a gap in the lip and/or the roof of their mouth. They are among the most common birth defects, occurring in approximately 1 in 700 live births globally in modern times, research suggests. But diagnosing orofacial clefts (the umbrella term for cleft lip and palate) in archaeological remains is rare, with only around 50 cases identified to date around the world, according to one study.
The latest study, published Nov. 3 in the journal Ñawpa Pacha, is the first time an orofacial cleft has been documented in an Andean trophy head, offering a "unique opportunity" to explore how ancient peoples of the region viewed such conditions, Scaffidi wrote in the paper.
"This finding is important because it shows that people survived, and even thrived, with this condition in the ancient Andes," Scaffidi told Live Science in an email. "It helps show that what we define as a disability and how we respond to it is culturally, rather than biologically, determined."
For millennia, ancient peoples in parts of the Andes mountains in South America, as well as the surrounding regions, collected severed heads as trophies, processing them for preservation and display purposes, Scaffidi said. Most known examples date to between roughly 300 B.C. and A.D. 800, often originating from around modern-day Nazca in coastal Peru's Ica department (Peru has 24 departments, or regions). Trophy heads were likely passed down as heirlooms through generations, Scaffidi said.
"Most trophies were mummified naturally in the arid desert environment, and many preserve hair and flesh," Scaffidi said. "We still debate whether these heads were lovingly curated remains of beloved ancestors or souvenirs of violent conquest of enemies, but many do also display violent injuries received before and around the time of death."

During a research project, Scaffidi came across an intriguing example in the catalog of the Museum of Modern and Contemporary Art in Saint-Etienne, France, purportedly originating from the Ica department.
Scaffidi examined photographs of the mummified head and determined that the individual was probably male and a young adult at the time of death. Based on the visible facial structures, she diagnosed the individual with a cleft lip.
Perhaps the most serious complication of orofacial clefts is difficulty latching during breastfeeding, but these conditions can also cause respiratory, hearing and speech issues, she wrote in the study. Today, these birth defects are typically treated with surgery in the first few months of life, but in the ancient Andean world they would have presented a significant challenge to mothers and caregivers during the baby's infancy. The individual Scaffidi studied, for example, would likely have required specialized care to receive nourishment as an infant.
But not only did this individual survive into early adulthood, it is possible that their condition even granted them special status, Scaffidi said. Cultural responses to orofacial clefts in the ancient Americas varied widely, from shame to veneration. But taking into account what is known about the worldviews of ancient Andean peoples specifically, it is likely this individual was perceived as sacred and afforded a high-status role throughout life and beyond, Scaffidi said.
In the absence of documentary or textual sources, ancient ceramic vessels from the region — particularly those produced by the Moche culture (A.D. 200 to 850) of northern Peru — provide clues as to how congenital conditions might have been understood at the time.
In the study, Scaffidi found 30 ceramic representations of orofacial clefts that had already been documented from the wider Andean region, 20 of which were from Moche areas. These examples mostly depict males adorned in elite jewelry or head wrappings, or performing shamanic or medical activities, suggesting that they were people of importance. Other research suggests that the Moche believed facial markings protected people from supernatural harm; thus, the markings of birth defects were revered.
Previous research has indicated that Andean trophy heads were often collected from individuals perceived as having supernatural powers, with the belief that the people taking the heads could use this power to benefit their own communities.
The collection of this particular subject as a trophy head, even through potentially violent means, is consistent with the idea that orofacial clefts were celebrated by the takers, Scaffidi said. What may be seen today as a disability was likely considered a "blessing," according to the study.
]]>In many earthquakes, the subsurface moves more than the surface. But the quake on the Sagaing Fault was different because the surface moved just as much as the rocks miles deep, a new study shows. This was likely because the Sagaing Fault dates back to between 14 million and 28 million years ago.
"Over that vast time, the rough edges and bends in the fault have been ground down," first author Eric Lindsey, a geoscientist at the University of New Mexico, said in a statement. "Because it is so smooth and straight, the earthquake rupture could travel very efficiently across a huge distance."
When the magnitude 7.7 quake hit on March 28, it ruptured about 300 miles (500 kilometers) of ground — a remarkably long surface rupture. Typically, Lindsey said, earthquake ruptures are more on the order of 19 to 37 miles (30 to 60 km). This rupture came with very severe shaking, and more than 5,400 people died.
Because of the infrastructure damage from the quake and ongoing armed conflict in Myanmar, Lindsey and his colleagues turned to satellite imagery to study the event. They used optical imagery from the European Space Agency's Sentinel-2 satellites, along with satellite radar data, to track ground motion down to a fraction of an inch.
Their findings, published Dec. 8 in the journal Nature Communications, showed that the earthquake was very efficient in transferring its energy up to the surface. Quakes originate deep underground. In the case of the Myanmar quake, the rupture started 6 miles (10 km) or so deep. Most of the time, the underground movement doesn't entirely transfer to the surface — a phenomenon called "shallow slip deficit." (Slip is the movement of one side of the fault against the other.) In the Myanmar quake, there was no shallow slip deficit.
"The massive amount of slip that happened miles underground was transferred 100% to the surface," Lindsey said.
The ground surface on one side of the fault moved 10 to 15 feet (3 to 4.5 meters) in relation to the other. This movement was even caught on camera in a first-of-its-kind video.
Because of the efficiency of the energy transfer from deep underground to the surface, a quake on a mature fault like the one that hit Myanmar may cause more ground shaking than a quake on a more jagged fault line, Lindsey explained.
"The significance lies in safety," he said. "This earthquake showed us that mature faults can be much more efficient at transmitting energy to the surface than younger ones, which has direct implications for how we build infrastructure to withstand the 'Big One' in the United States."
]]>The symptoms: A man went to the dermatology department of a hospital after experiencing an array of symptoms for about three years. When he submerged his hands in water, such as during hand-washing, the skin on the back of the man's hands thickened and became overly wrinkly, with white bumps and growths appearing.
Whenever this occurred, his hands felt very itchy and like they were burning, and the symptoms were worse in the summer months, he told doctors. Symptoms did not arise during winter, and his palms were unaffected year-round.
The man had previously sought treatment at his local clinic, where he was diagnosed with chronic eczema — which causes skin to become dry, thick and itchy — and prescribed a strong retinoid ointment, which he used intermittently. However, this treatment didn't work, and his symptoms gradually worsened. His wrists and elbows had also started to develop the skin lesions over the 1.5 years prior to the hospital visit.
The patient had no family history of similar skin conditions and did not experience excessive sweating or have any allergies, and he had never injured his hands. The man attributed the worsening of his condition to him washing his hands more frequently during the COVID-19 pandemic, the doctors wrote in the report of the case.
What happened next: During a physical examination at the hospital, the man's hands were immersed in water for 10 minutes, immediately causing the tops of his hands, fingers and wrists to grow red, scaly and wrinkly with white lesions. Notably, the "excessive wrinkling" and bumps ended in a straight line on the sides of his hands, leaving his palms unaffected.
The doctors took biopsies from the white bumps on his right hand, which revealed that the sweat ducts in the top layer of skin had widened and that there were more sweat glands than normal. The results also showed he had hyperkeratosis, meaning his body was producing too much of the protein keratin, causing the outer layer of skin to thicken.
The medical team wrote that "the patient's clinical process was quite interesting," because the symptoms of the skin condition appeared only after his hands were immersed in water and all symptoms disappeared around 30 minutes after his hands dried.

The diagnosis: The doctors diagnosed the patient with a condition called aquagenic syringeal acrokeratoderma (ASA) based on how his symptoms appeared in the clinic and the results of the biopsy. In almost all other cases, however, it affects the palms of the hands, not the backs of the hands or fingers.
The short-lived symptoms of this skin disease are known as the "hand in the bucket sign," because they occur after the hands are submerged in water. The symptoms normally disappear within a few hours of drying, but a subset of people with ASA have persistent skin lesions that are aggravated by water exposure, according to the Genetic and Rare Diseases Information Center (GARD).
The cause of ASA is currently unknown, but it may be linked to "an acquired sweat gland abnormality" or some trigger that causes thickening of the skin, according to research published in the Journal of the American Academy of Dermatology.
The treatment: The patient was treated with topical hydrocortisone urea ointment, which is a corticosteroid and skin moisturizer that can be applied directly to the affected area. It is typically used to treat skin irritation, swelling and redness.
The doctors also recommended that the patient avoid getting his hands wet more than what was strictly necessary. The man was still attending follow-up appointments when the doctors wrote about his case, and they noted that his symptoms had eased substantially after just one month.
What makes the case unique: ASA is thought to be a rare condition, although its exact prevalence is unknown.
Data suggests the condition is most common in female adolescents, the case report authors noted. It also occurs in about 40% to 84% of cystic fibrosis (CF) patients and carriers, meaning people who have just one copy of the CF gene mutation, according to GARD. (You need two copies to develop CF.) This pattern hints at ASA being caused in part by mutations in that gene, at least in some cases.
The patient described in this case was the first known to have ASA that didn't affect his palms, the doctors wrote in the report. It's unclear why his case manifested differently than others previously reported.
For more intriguing medical cases, check out our Diagnostic Dilemma archives.
This article is for informational purposes only and is not meant to offer medical advice.
]]>Three decades later, another new technology has unleashed another wave of exuberance. Investors are pouring billions into any company with "AI" in its name. But there is a crucial difference between these two bubbles, which isn't always recognised. The World Wide Web existed. It was real. Artificial general intelligence does not exist, and no one knows if or when it ever will.
In February, the CEO of OpenAI, Sam Altman, wrote on his blog that the very latest systems have only just started to "point towards" AI in its "general" sense. OpenAI may market its products as "AIs," but they are merely statistical data-crunchers, rather than "intelligences" in the sense that human beings are intelligent.
So why are investors so keen to give money to the people selling AI systems? One reason might be that AI is a mythical technology. I don't mean it is a lie. I mean it evokes a powerful, foundational story of Western culture about human powers of creation.
Perhaps investors are willing to believe AI is just around the corner because it taps into myths that are deeply ingrained in their imaginations?
The most relevant myth for AI is the Ancient Greek myth of Prometheus.
There are many versions of this myth, but the most famous are found in Hesiod's poems Theogony and Works and Days, and in the play Prometheus Bound, traditionally attributed to Aeschylus.
Prometheus was a Titan, a god in the Ancient Greek pantheon. He was also a criminal who stole fire from Hephaestus, the blacksmith god. Hiding the fire in a stalk of fennel, Prometheus came to earth and gave it to humankind. As punishment, he was chained to a mountain, where an eagle visited every day to eat his liver.
Prometheus' gift was not simply the gift of fire; it was the gift of intelligence. In Prometheus Bound, he declares that before his gift humans saw without seeing and heard without hearing. After his gift, humans could write, build houses, read the stars, perform mathematics, domesticate animals, construct ships, invent medicines, interpret dreams and give proper offerings to the gods.
The myth of Prometheus is a creation story with a difference. In the Hebrew Bible, God does not give Adam the power to create life. But Prometheus gives (some of) the gods' creative power to humankind.
Hesiod indicates this aspect of the myth in Theogony. In that poem, Zeus not only punishes Prometheus for the theft of fire; he punishes humankind as well. He orders Hephaestus to fire up his forge and construct the first woman, Pandora, who unleashes evil on the world.
The fire that Hephaestus uses to make Pandora is the same fire that Prometheus has given humankind.

The Greeks proposed the idea that humans are a form of artificial intelligence. Prometheus and Hephaestus use technology to manufacture men and women. As historian Adrienne Mayor reveals in her book Gods and Robots, the ancients often depicted Prometheus as a craftsman, using ordinary tools to create human beings in an ordinary workshop.
If Prometheus gave us the fire of the gods, it would seem to follow that we can use this fire to make our own intelligent beings. Such stories abound in Ancient Greek literature, from the inventor Daedalus, who created statues that came to life, to the witch Medea, who could restore youth and potency with her cunning drugs. Greek inventors also constructed mechanical computers for astronomy and remarkable moving figures powered by gravity, water and air.
Some 2,700 years have passed since Hesiod first wrote down the story of Prometheus. In the ensuing centuries, the myth has been endlessly retold, especially since the publication of Mary Shelley's Frankenstein; or, The Modern Prometheus in 1818.
But the myth is not always told as fiction. Here are two historical examples where the myth of Prometheus seemed to come true.
Gerbert of Aurillac was the Prometheus of the 10th century. He was born in the early 940s CE, went to school at Aurillac Abbey, and became a monk himself. He proceeded to master every known branch of learning. In the year 999, he was elected Pope. He died in 1003 under his pontifical name, Sylvester II.
Rumours about Gerbert spread wildly across Europe. Within a century of his death, his life had already become legend. One of the most famous legends, and the most pertinent in our age of AI hype, is that of Gerbert's "brazen head." The legend was told in the 1120s by the English historian William of Malmesbury, in his well-researched and highly regarded book, Deeds of the English Kings.
Gerbert was deeply learned in astronomy, a science of prediction. Astronomers could use the astrolabe to predict the position of the stars and foresee cosmological events such as eclipses. According to William, Gerbert used his knowledge of astronomy to construct a talking head. After inspecting the movements of the stars and planets, he cast a head in bronze that could answer yes-or-no questions.
First Gerbert asked the head: "Will I become Pope?"
"Yes," answered the head.
Then Gerbert asked: "Will I die before I sing mass in Jerusalem?"
"No," the head replied.
In both cases, the head was correct, though not as Gerbert anticipated. He did become Pope, and he sensibly avoided going on pilgrimage to Jerusalem. One day, however, he sang mass at Santa Croce in Gerusalemme in Rome. Unfortunately for Gerbert, Santa Croce in Gerusalemme was known in those days simply as "Jerusalem."
Gerbert sickened and died. On his deathbed, he asked his attendants to cut up his body and cast away the pieces, so he could go to his true master, Satan. In this way, he was, like Prometheus, punished for his theft of fire.

It is a thrilling story. It is not clear whether William of Malmesbury actually believed it. But he does try to persuade his readers that it is plausible. Why did this great historian with a devotion to the truth insert some fanciful legends about a French pope into his history of England? Good question!
Is it so fanciful to believe that an advanced astronomer might build a general-purpose prediction machine? In those days, astronomy was the most powerful science of prediction. The sober and scholarly William was at least willing to entertain the idea that brilliant advances in astronomy might make it possible for a Pope to build an intelligent chatbot.
Today, that same possibility is credited to machine-learning algorithms, which can predict which ad you will click, which movie you will watch, which word you will type next. We can be forgiven for falling under the same spell.
The Prometheus of the 18th century was Jacques de Vaucanson, at least according to Voltaire:
Bold Vaucanson, rival of Prometheus,
Seems, imitating the springs of nature,
To steal the fire of heaven to animate the body.

Vaucanson was a great machinist, famous for his automata. These were clockwork devices that realistically simulated human or animal anatomy. Philosophers of the time believed that the body was a machine — so why couldn't a machinist build one?
Sometimes Vaucanson's automata were scientifically significant. He constructed a piper, for example, that had lips and lungs and fingers, and blew the pipe in much the same way a human would. Historian Jessica Riskin explains in her book The Restless Clock that Vaucanson had to make significant discoveries in acoustics in order to make his piper play in tune.
Sometimes his automata were less scientific. His digesting duck was hugely famous, but turned out to be fraudulent. It appeared to eat and digest food, but its poos were in fact prefabricated pellets hidden inside the mechanism.
Vaucanson spent decades working on what he called a "moving anatomy." In 1741, he presented a plan to the Lyons Academy to build an "imitation of all animal operations." Twenty years later, he was at it again. He secured support from King Louis XV to build a simulation of the circulatory system. He claimed he could build a complete, living artificial body.

There is no evidence that Vaucanson ever completed a whole body. In the end, he couldn't live up to the hype. But many of his contemporaries believed he could do it. They wanted to believe in his magical mechanisms. They wished he would seize the fire of life.
If Vaucanson could manufacture a new human body, couldn't he also repair an existing one? This is the promise of some AI companies today. According to Dario Amodei, CEO of Anthropic, AI will soon allow people "to live as long as they want." Immortality seems like an attractive investment.
Sylvester II and Vaucanson were great technologists, but neither was a Prometheus. They stole no fire from the gods. Will the aspiring Prometheans of Silicon Valley succeed where their predecessors have failed? If only we had Sylvester II's brazen head, we could ask it.
This edited article is republished from The Conversation under a Creative Commons license. Read the original article.
]]>In a new study, researchers tested people's ability to distinguish images of real faces from AI-generated ones and found that most participants missed most of the AI-generated faces. Even "super-recognizers" — an elite group with exceptionally strong facial-processing abilities — were able to correctly identify fake faces as fake only 41% of the time. Typical recognizers correctly identified only about 30% of the AI-generated faces.
However, the study also showed that people's detection of fake faces improved when they were given just five minutes of training beforehand. The training taught the participants how to spot common computer-rendering errors, such as unnatural-looking skin textures or oddities in how hair lies across the face. After training, detection accuracy increased substantially, with super-recognizers spotting 64% of fake faces and typical recognizers identifying 51%.
Given how difficult the task proved to be even for highly skilled participants, how confident are you in your own ability to spot AI faces? Answer our poll below, and let us know why in the comments.
—Brain quiz: Test your knowledge of the most complex organ in the body
—Live Science crossword puzzle: Test your knowledge on all things science with our weekly puzzle
]]>This email was from a thrift shop, Thrifty Boutique in Chilliwack, B.C. — unlike the many queries archeologists receive every year to authenticate objects that people have in their possession.
The shop wanted to determine whether items donated to the store (and initially put up for sale) were, in fact, ancient artifacts with historical significance. Shop employees relayed that a customer, who did not leave their name, had said that the 11 rings and two medallions (though one may be a belt buckle) in the display case, bearing a price tag of $30, were potentially ancient.
Thrifty Boutique wasn't looking for a valuation of the objects, but rather guidance on their authenticity.
As archeology faculty, we analyzed these objects with Barbara Hilden, director of the Museum of Archaeology and Ethnology at Simon Fraser University, after the store arranged to bring the items to the museum.
Our initial visual analysis of the objects led us to suspect that, based on their shapes, designs and construction, they were ancient artifacts most likely from somewhere within the boundaries of what was once the Roman Empire. They may date to late antiquity (roughly the third to sixth or seventh century) and/or the medieval period.
The initial dating was based largely on the decorative motifs that adorn these objects. The smaller medallion appears to bear a Chi Rho (Christogram), which was popular in late antiquity. The larger medallion (or belt buckle) resembles items from the Byzantine period.

The disparities between the two objects, suggesting different time periods, make it unlikely they're from the same hoard. We expect they were assembled into an eclectic collection by the as-yet-unidentified person who acquired them prior to their donation to Thrifty Boutique.
With the exciting revelation that the objects may be authentic ancient artifacts, the thrift store offered to donate them to SFU's archeology museum. The museum had to carefully consider whether it had the capacity and expertise to care for these objects in perpetuity, and ultimately decided to commit to their care and stewardship because of the potential for student learning.
Officially accepting and officially transferring these objects to the museum took more than a year. We grappled with the ethical implications of acquiring a collection without known provenance (history of ownership) and balanced this against the learning opportunities that it might offer our students.

Learning to investigate the journey of the donated objects is akin to the process of provenance research in museums.
In accepting items without known provenance, museums must consider the ethical implications of doing so. The Canadian Museums Association Ethics Guidelines state that "museums must guard against any direct or indirect participation in the illicit traffic in cultural and natural objects."
When archeological artifacts have no clear provenance, it is difficult — if not impossible — to determine where they originally came from. It is possible such artifacts were illegally acquired through looting, even though Canada's Cultural Property Export and Import Act exists to restrict the importation and exportation of such objects.
We are keenly aware of the responsibility museums have to not entertain donations of illicitly acquired materials. However, in this situation, there is no clear information — as yet — about where these items came from and whether they are ancient artifacts or modern forgeries. Without knowing this, we cannot notify authorities nor facilitate returning them to their original source.
With a long history of ethical engagement with communities, including repatriation, the Museum of Archaeology and Ethnology is committed to continuing such work. This donation will be no different if we are able to confirm our suspicions about the objects' authenticity.
Archeological forgeries, while not widely publicized, are perhaps more common than most realize — and they plague museum collections around the world.
Well-known examples of the archeological record being affected by inauthentic artifacts are the 1920s Glozel hoax in France and the fossil forgery known as Piltdown Man.
Other examples of the falsification of ancient remains include the Cardiff Giant and crystal skulls, popularized in one of the Indiana Jones movies.
Various scientific techniques can help determine authenticity, but it can sometimes prove impossible to be 100 per cent certain because of the level of skill involved in creating convincing forgeries.

Copies of ancient artifacts also exist for honest purposes, such as those created for the tourist market or even for artistic reasons. Museums full of replicas still attract visitors, because replicas offer another means of engaging with the past, and we are therefore confident the donation has a place within the museum whether the objects are authentic or not.
By working closely with the objects, students will learn how to become archeological detectives and engage with the process of museum research from start to finish. The information gathered from this process will help to determine where the objects may have been originally uncovered or manufactured, how old they might be and what their original significance may have been.
Object-based learning using museum collections demonstrates the value of hands-on engagement in an age of increasing concern about the impact of artificial intelligence on education.
The new archeology course we have designed, which will run at SFU in September 2026, will also focus heavily on questions of ethics and provenance, including what the process would look like if the objects — if determined to be authentic — could one day be returned to their country of origin.
The students will also benefit from the wide-ranging expertise of our colleagues in the department of archeology at SFU, including access to various technologies and avenues of archeological science that might help us learn more about the objects.
This will involve techniques such as X-ray fluorescence, which can be used to investigate the elemental composition of materials, as well as the use of 3D scanners and printers to create resources for further study and outreach.

Local museum professionals have also agreed to help mentor the students in exhibition development and public engagement, a bonus for many of our students who aspire to have careers in museums or cultural heritage.
Overall, the course will afford our students a rare opportunity to work with objects from a regional context not currently represented in the museum while simultaneously piecing together the story of these items far from their probable original home across the Atlantic.
We are excited to be part of their new emerging story at Simon Fraser, and can't wait to learn more about their mysterious past.
This edited article is republished from The Conversation under a Creative Commons license. Read the original article.
]]>Milestone: James Webb Space Telescope launches
Date: Dec. 25, 2021
Where: Guiana Space Centre, Kourou, French Guiana
Who: NASA, European Space Agency and Canadian Space Agency scientists
On a cloudy winter's day in the Amazon jungle, a rocket blasted off into space — and changed our view of the universe forever.
The James Webb Space Telescope (JWST) left Earth aboard an Ariane 5 rocket at 25,000 mph (40,000 km/h) "from a tropical rainforest to the edge of time itself," according to a live broadcast from NASA.
About a month later, it reached its parking spot in space: the second sun-Earth Lagrange point (L2), 930,000 miles (1.5 million kilometers) from Earth, where the combined gravity of the sun and Earth lets a spacecraft orbit the sun in lockstep with our planet. The telescope beamed back its first, spectacular pictures in July 2022. And the firehose of data it has sent back since has transformed our understanding of the cosmos.
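As a rough check (a standard three-body approximation, not a figure from the article), L2's distance from Earth can be recovered from the Earth-sun distance R and the Earth-to-sun mass ratio:

r \approx R \left( \frac{M_\oplus}{3 M_\odot} \right)^{1/3} \approx (1.496 \times 10^{8}\ \text{km}) \times \left( \frac{3.0 \times 10^{-6}}{3} \right)^{1/3} \approx 1.5 \times 10^{6}\ \text{km},

which reproduces the quoted 930,000 miles.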
JWST has been so pivotal in part because it can peer back to the "cosmic dawn," a period a few hundred million years after the Big Bang, when the first stars were winking on.
"The James Webb Space Telescope has proven itself capable of seeing 98% of the way back to the Big Bang," Peter Jakobsen, an affiliate professor of astrophysics at the University of Copenhagen in Denmark, previously told Live Science in an email.
Yet Webb, first conceived in the late 1990s as the Next Generation Space Telescope, almost didn't launch at all. The now-iconic, $10 billion project was catastrophically over budget, plagued by years' worth of delays and snarled by "stupid mistakes."
That was in part because, when it launched, it was by far the most complex telescope ever built.
It took more than 20,000 engineers and hundreds of scientists to design, build and launch the eye in the sky. The telescope's 21.3-foot (6.5 meter) mirror, made of 18 hexagonal segments arranged in a honeycomb pattern, had to be folded up to fit on the rocket, then unfolded once in space. Yet despite being foldable, it also had to be so smooth that if it were as big as a continent, "it would feature no hill or valley greater than ankle height," according to Quanta Magazine.

To see the earliest epochs of cosmic history, Webb needed infrared vision. That's because ancient light has been stretched, or red-shifted, into infrared wavelengths as it travels across space-time. On Earth, humans and every other living thing give off heat in the form of infrared radiation, and that would drown out the faint infrared signals from the most distant, ancient starlight. So JWST needed to be lofted into the cold dark of outer space to use its infrared instruments.
Once JWST started imaging the cosmos, it promptly began breaking our existing models of the universe. It rapidly confirmed the Hubble tension — the discrepancy in the universe's measured expansion rate depending on where and how astronomers measure it. It has found hints of potentially life-sustaining atmospheres shrouding distant exoplanets. And it has spotted shockingly bright galaxies and seemingly "impossible" black holes at the dawn of time. All these clues are pointing to new understandings of the universe.
Some of the questions JWST is raising, such as whether other planets harbor life, it will probably not be able to answer in its planned 10-year lifespan. But future telescopes — such as the currently operational Vera C. Rubin Observatory, meant to create a real-time "movie of the universe"; the recently completed Nancy Grace Roman Space Telescope, set to launch in 2027 and resolve questions about dark matter and energy; the Extremely Large Telescope, set to turn on in 2029; or the recently announced Habitable Worlds Observatory, which may come online in the 2030s — could start to answer the questions that Webb is raising.
]]>We all know the Christmas story is one in which peace and joy are proclaimed, and this permeates our festivities, family gatherings and present-giving. Countless Christmas cards depict the Holy Family – starlit, in a quaint stable, nestled comfortably in a sleepy little village.
However, when I began to research my book on the childhood of Jesus, Boy Jesus: Growing up Judaean in Turbulent Times, that idyllic scene started to seem jarringly wrong in terms of his family's actual circumstances at the time he was born.
The Gospel stories themselves tell of dislocation and danger. For example, a “manger” was, in fact, a foul-smelling feeding trough for donkeys. A newborn baby laid in one is a profound sign given to the shepherds, who were guarding their flocks at night from dangerous wild animals (Luke 2:12).
When these stories are unpacked for their core elements and placed in a wider historical context, the dangers become even more glaring.
Take King Herod, for example. He enters the scene in the nativity stories without any introduction at all, and readers are supposed to know he was bad news. But Herod was appointed by the Romans as their trusted client ruler of the province of Judaea. He stayed long in his post because he was – in Roman terms – doing a reasonable job.
Jesus’ family claimed to be of the lineage of Judaean kings, descended from David and expected to bring forth a future ruler. The Gospel of Matthew begins with Jesus’ entire genealogy, it was that important to his identity.
But a few years before Jesus’ birth, Herod had violated the tomb of David and looted it. How did that affect the family and the stories they would tell Jesus? How did they feel about the Romans?
As for Herod’s attitude to Bethlehem, remembered as David’s home, things get yet more dangerous and complex.
When Herod was first appointed, he was evicted by a rival ruler who was supported by the Parthians (Rome's enemy) and loved by many local people. Herod was attacked by those people just near Bethlehem.
He and his forces fought back and massacred the attackers. When Rome vanquished the rival and brought Herod back, he built a memorial to his victorious massacre on a nearby site he called Herodium, overlooking Bethlehem. How did that make the local people feel?

And far from being a sleepy village, Bethlehem was so significant as a town that a major aqueduct construction brought water to its centre. Fearing Herod, Jesus’ family fled from their home there, but they were on the wrong side of Rome from the start.
They were not alone in their fears or their attitude to the colonisers. The events that unfolded, as told by the first-century historian Josephus, show a nation in open revolt against Rome shortly after Jesus was born.
When Herod died, thousands of people took over the Jerusalem temple and demanded liberation. Herod’s son Archelaus massacred them. A number of Judaean revolutionary would-be kings and rulers seized control of parts of the country, including Galilee.
It was at this time, in the Gospel of Matthew, that Joseph brought his family back from refuge in Egypt – to this independent Galilee and a village there, Nazareth.
But independence in Galilee didn’t last long. Roman forces, under the general Varus, marched down from Syria with allied forces, destroyed the nearby city of Sepphoris, torched countless villages and crucified huge numbers of Judaean rebels, eventually putting down the revolts.
Archelaus – once he was installed officially as ruler – followed this up with a continuing reign of terror.
As a historian, I’d like to see a film that shows Jesus and his family embedded in this chaotic, unstable and traumatic social world, in a nation under Roman rule.
Instead, viewers have now been offered The Carpenter's Son, a film starring Nicolas Cage. It's partly inspired by an apocryphal (not biblical) text named the Paidika Iesou – the Childhood of Jesus – later called The Infancy Gospel of Thomas.
You might think the Paidika would be something like an ancient version of the hit TV show Smallville from the 2000s, which followed the boy Clark Kent before he became Superman.
But no, rather than being about Jesus grappling with his amazing powers and destiny, it is a short and quite disturbing piece of literature made up of bits and pieces, assembled more than 100 years after the life of Jesus.
The Paidika presents the young Jesus as a kind of demigod no one should mess with, including his playmates and teachers. It was very popular with non-Jewish, pagan-turned-Christian audiences who sat in an uneasy place within wider society.
The miracle-working Jesus zaps all his enemies – and even innocents. At one point, a child runs into Jesus and hurts his shoulder, so Jesus strikes him dead. Joseph says to Mary, “Do not let him out of the house so that those who make him angry may not die.”
Such stories rest on a problematic idea that one must never kindle a god’s wrath. And this young Jesus shows instant, deadly wrath. He also lacks much of a moral compass.
But this text also rests on the idea that Jesus’ boyhood actions against his playmates and teachers were justified because they were “the Jews”. “A Jew” turns up as an accuser just a few lines in. There should be a content warning.
The nativity scene from The Carpenter’s Son is certainly not peaceful. There is a lot of screaming and horrific images of Roman soldiers throwing babies into a fire. But, like so many films, the violence is somehow just evil and arbitrary, not really about Judaea and Rome.
It is surely the contextual, bigger story of the nativity and Jesus’ childhood that is so relevant today, in our times of fracturing and “othering”, where so many feel under the thumb of the unyielding powers of this world.
In fact, some churches in the United States are now reflecting this contemporary relevance as they adapt nativity scenes to depict ICE detentions and deportations of immigrants and refugees.
In many ways, the real nativity is indeed not a simple one of peace and joy, but rather one of struggle – and yet mystifying hope.
This edited article is republished from The Conversation under a Creative Commons license. Read the original article.
]]>The Jupiter-size world, detected by the James Webb Space Telescope (JWST), doesn't have the familiar helium-hydrogen combination we are used to in atmospheres from our solar system, nor other common molecules, like water, methane or carbon dioxide.
Rather, the planet seems to have soot clouds near the top of the atmosphere that condense into diamonds deeper in the atmosphere. This kind of overall atmosphere, which is made of helium and carbon, has never been spotted on another planet. What's even weirder is that its host star is not even a normal star.
"This was an absolute surprise," study co-author Peter Gao, a staff scientist at the Carnegie Earth and Planets Laboratory, said in a statement. "I remember after we got the data down, our collective reaction was, 'What the heck is this?' It's extremely different from what we expected."
Researchers probed the bizarre environment of the planet, known as PSR J2322-2650b, in a paper published Tuesday (Dec. 16) in The Astrophysical Journal Letters. Although the planet was detected by a radio telescope survey in 2017, it took the sharper vision of JWST (which launched in 2021) to examine PSR J2322-2650b's environment from 750 light-years away.
PSR J2322-2650b orbits a pulsar. Pulsars are fast-spinning neutron stars — the ultradense cores of stars that have exploded as supernovas — that emit radiation in brief, regular pulses that are visible only when their lighthouse-like beams of electromagnetic radiation aim squarely at Earth. (That's bizarre on its own, as no other pulsar is known to have a gas-giant planet, and few pulsars have planets at all, the science team stated.)
JWST's infrared instruments can't actually see this particular pulsar, because the pulsar shines mainly in high-energy gamma-rays rather than in infrared light. However, JWST's "blindness" to the pulsar is actually a boon to scientists, because they can easily probe the companion planet, PSR J2322-2650b, to see what its environment is like.
"This system is unique because we are able to view the planet illuminated by its host star, but not see the host star at all," co-author Maya Beleznay, a doctoral candidate in physics at Stanford University, said in the statement. "We can study this system in more detail than normal exoplanets."

PSR J2322-2650b's origin story is an enigma. It is only a million miles (1.6 million kilometers) from its star — nearly 100 times closer than Earth is to the sun. That's even stranger when you consider that the gas giant planets of our solar system are much farther out — Jupiter is 484 million miles (778 million km) from the sun, for example.
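As a rough consistency check on that separation (assuming a typical pulsar mass of about 1.4 suns, a value the article does not state), Kepler's third law gives:

T = 2\pi \sqrt{\frac{a^{3}}{GM}} \approx 2\pi \sqrt{\frac{(1.6 \times 10^{9}\ \text{m})^{3}}{(6.67 \times 10^{-11})(1.4 \times 1.99 \times 10^{30}\ \text{kg})}} \approx 2.9 \times 10^{4}\ \text{s} \approx 8\ \text{hours},

close to the 7.8-hour orbital period described next.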
The planet whips around its star in only 7.8 hours, and it's shaped like a lemon because the pulsar's gravity pulls far harder on the planet's near side than on its far side, stretching it. At first glance, it appears PSR J2322-2650b could have a similar formation scenario as "black widow" systems, where a sunlike star is next to a small pulsar.
In black-widow systems, the pulsar "consumes" or erodes the nearby star, much like the mythologized feasting behavior of the black widow spider, after which the phenomenon is named. That happens because the star is so close to the pulsar that its material falls onto the pulsar. The extra stellar material causes the pulsar to gradually spin faster and to generate a strong "wind" of radiation that erodes the nearby star.
But lead author Michael Zhang, a postdoctoral fellow in exoplanet atmospheres at the University of Chicago, said this pathway made it difficult to understand how PSR J2322-2650b came to be. In fact, the planet's formation appears to be unexplainable at this point.
"Did this thing form like a normal planet? No, because the composition is entirely different," Zhang said in the statement. "It's very hard to imagine how you get this extremely carbon-enriched composition. It seems to rule out every known formation mechanism."
Scientists still can't explain how the soot or diamonds are present in the exoplanet's atmosphere. Usually, molecular carbon doesn't appear in planets that are very close to their stars, due to the extreme heat.
One possibility for what happened comes from study co-author Roger Romani, a professor of physics at Stanford University and the Kavli Institute for Particle Astrophysics and Cosmology. After the planet cooled down from its formation, he suggested, carbon and oxygen in its interior crystallized.
But even that doesn't account for all of the odd properties. "Pure carbon crystals float to the top and get mixed into the helium … but then something has to happen to keep the oxygen and nitrogen away," Romani explained in the same statement. "And that's where the mystery [comes] in."
Scientists hope to continue studying PSR J2322-2650b. "It's nice to not know everything," Romani said. "I'm looking forward to learning more about the weirdness of this atmosphere. It's great to have a puzzle to go after."
]]>Most massive stars reach the ends of their lives by collapsing and exploding as supernovas, seeding the cosmos with elements such as carbon and iron. A different kind of cataclysm, known as a kilonova, occurs when the ultradense remnants of dead stars, called neutron stars, collide, forging even heavier elements like gold.
The newly identified event, named AT2025ulz, appears to combine these two types of cosmic explosions in a way that scientists have long hypothesized but never before observed.
If confirmed, it could represent the first example of a "superkilonova," a rare hybrid blast in which a single object produces two distinct but equally dramatic explosions.
"We do not know with certainty that we found a superkilonova, but the event nevertheless is eye opening," study lead author Mansi Kasliwal, a professor of astronomy at Caltech, said in a statement.
The findings are detailed in a study published Dec. 15 in The Astrophysical Journal Letters.
AT2025ulz first caught astronomers' attention on Aug. 18, 2025, when gravitational wave detectors operated by the U.S.-based Laser Interferometer Gravitational-Wave Observatory (LIGO) and its European partner, Virgo, registered a subtle signal consistent with the merger of two compact objects.
Soon after, the Zwicky Transient Facility at Palomar Observatory in California spotted a rapidly fading red point of light in the same region of the sky, according to the statement. The event's behavior closely resembled that of GW170817 — the only confirmed kilonova, which was observed in 2017 — with its red glow consistent with freshly forged heavy elements such as gold and platinum.
But instead of continuing to fade, as kilonovas typically do, AT2025ulz began to brighten again, the study reported. Follow-up observations from a dozen observatories around the world, including Hawaii's Keck Observatory, showed the light shifting toward bluer wavelengths and revealing the fingerprints of hydrogen, a hallmark of a supernova rather than a kilonova.
That data helped researchers confirm the presence of hydrogen and helium, indicating that the massive star had shed most of its hydrogen-rich outer layers before detonating, the paper noted.
To explain the baffling sequence, the team proposed that a massive, rapidly spinning star collapsed and exploded as a supernova. But instead of forming a single neutron star, its core split into two smaller neutron stars. Those newborn remnants then spiraled together and collided within hours, triggering a kilonova inside of the expanding debris of the supernova.
The combined effect is a hybrid explosion in which the supernova initially masks the kilonova's signature, accounting for the unusual observations, the researchers wrote in the paper.
Clues from the gravitational-wave data bolster this idea. While the signal cannot precisely determine the individual masses of the two merging neutron stars, it does rule out scenarios in which both were heavier than the sun, the new paper noted.
The researchers found a 99% chance that at least one of the objects was less massive than the sun — an outcome that challenges conventional stellar physics, which predicts that neutron stars should not weigh less than about 1.2 solar masses. Such lightweight neutron stars can form only when a very rapidly spinning star collapses, matching the scenario proposed for AT2025ulz, according to the statement.
However, the study noted that the complexity of the overlapping signals makes it difficult to rule out the possibility that the signals came from unrelated events that happened to occur close together. Ultimately, the only way to test the theory will be to find more such events using next-generation sky surveys such as those from Vera C. Rubin Observatory and NASA's upcoming Nancy Grace Roman Space Telescope, the researchers said.
"If superkilonovae are real, we'll eventually see more of them," study co-author Antonella Palmese, an assistant professor of astrophysics and cosmology at Carnegie Mellon University in Pennsylvania, said in a different statement. "And if we keep finding associations like this, then maybe this was the first."
She may have looked warily over her shoulder as she walked, on alert for saber-toothed cats or hyenas. She may have used her strong arms to climb the shrubby trees nearby, searching for fruit, eggs, or insects to eat. Or perhaps she simply rested on the shores of the croc-infested waters, gulping down water on a hot day.
She likely had no idea it was her last day on Earth.
Roughly 3.2 million years later, her skeleton was unearthed by paleoanthropologist Donald Johanson and his team on the International Afar Research Expedition.
The stunningly complete fossil was nicknamed "Lucy." And her remarkable species, Australopithecus afarensis, may have been our direct ancestor. Our discoveries about Lucy have transformed our understanding of humanity's tangled family tree.
Fifty years later, we know so much more about her species. In fact, anthropologists have learned so much about Lucy and her kind that we can now paint a picture of how she lived and died.
Her last day may have been filled with companionship, but it also entailed a relentless search for food. And it was likely dominated by the ever-present fear of predators.
"I suspect that the last day in her life was filled with danger," Johanson told Live Science.

The modern story of Lucy began on Nov. 24, 1974, in Hadar, Ethiopia. Johanson and then-graduate student Tom Gray stumbled upon a bone poking out of a gully. Following two weeks of careful excavation, their team recovered dozens of fossilized bones. Together, these bones made up 40% of the skeleton of a human ancestor, making it the most complete skeleton of an archaic human species that had ever been found.

Pamela Alderman, another member of the expedition, suggested the team nickname the skeleton Lucy, after the Beatles song "Lucy in the Sky with Diamonds."
"And it just became iconic," Johanson said, "a moniker that everybody knew."
Lucy’s discovery transformed the study of ancient human relatives.
"I was in high school when she was found," John Kappelman, a paleoanthropologist at the University of Texas at Austin, told Live Science. "It really did reset the way paleoanthropology worked."
Lucy's skeleton, along with subsequent discoveries of other fossils of her species, have given anthropologists a wealth of information about what is essentially the halfway point in human evolution. At 3.2 million years old, Lucy and her kind lived equidistant in time from our ape ancestors and contemporary humans.
"She's our touchstone," Jeremy DeSilva, a paleoanthropologist at Dartmouth College, told Live Science. "Everything sort of comes back to her as the reference point, and she deserves it."

One thing is fairly certain: Though there were some obvious differences, Lucy looked and acted a lot like us.
"If we saw her coming out of a grocery store today, we would recognize her as upright walking and some kind of human," Johanson said.
Although her strong arms and the shape of her finger bones suggest Lucy could climb trees, her pelvis and knees were clearly adapted to walking on two feet.
The size of Lucy's thigh bone also revealed that she stood only about 42 inches (1.1 meters) tall and weighed 60 to 65 pounds (27 to 30 kilograms) — about the size of a 6- or 7-year-old child today. And the eruption of her wisdom teeth showed that, although she was in her early teens when she died, she was a fully mature young adult.
"Australopithecus in general was maturing fast," DeSilva said, "and it makes sense if you're on a landscape full of predators." In species that are frequently prey, individuals that mature faster are more likely to pass on their genes. But australopithecines were unique—while their teeth and bodies matured quickly, their brains grew more slowly, telling us that they relied quite a bit on learning for survival, DeSilva said.
Her discovery also settled a debate that was raging in the early 1970s: Did our big brains evolve before we learned to walk upright? Lucy's head, which was not much bigger than a chimp's, showed the answer was no. Our ancestors became bipedal long before they evolved large brains.

Because her skeleton was found on its own, Lucy's "social life" is a little murkier than other parts of her daily life. But many researchers think she lived in a mixed-sex group of about 15 to 20 individuals, not unlike modern-day chimpanzees.
And although there's no direct evidence, Lucy's skeletal maturity suggests she could have had a baby. Bringing that relatively large-headed newborn through her relatively narrow pelvis would have been challenging, which means she may have had the help of a primitive "midwife."
If Lucy had a baby, she also likely had a partner. Other A. afarensis fossils, such as those of Kadanuumuu, show male australopithecines were only slightly larger than females, which, in primates, usually corresponds to more monogamous pairings.
Lucy and her kind would have spent a significant amount of their time avoiding becoming another animal's lunch. "These small creatures would have been nice hors d'oeuvres for a sabertooth or a large cat or hyena," Johanson said.
Perhaps because of that omnipresent danger, the group likely relied on each other.
"I think they had each other's backs and helped each other out," DeSilva said, "especially when they were in dangerous situations."
A healed bone fracture seen in Kadanuumuu provides evidence that these primates cared for one another. Around 3.6 million years ago, this male australopithecine broke his lower leg. By the time he died, though, the break was fully healed.
"On that landscape with that many predators, no doctors, no hospitals, no casts, no crutches, how in the world do you survive if not for social assistance?" DeSilva said. "It's really strong evidence that they didn't leave each other for dead."
Lucy probably started her last day much like any other, waking up in a treetop nest made of branches and leaves, where she slept alongside her group, before setting off to find food.
It's not clear whether she would have been alone or in a group when she left to forage; if she did have a baby, she may have carried it.
But there's no doubt that she would have spent a significant part of her day looking for food. She most likely ate a few staples, such as grasses, roots and insects, chemical elements in her tooth enamel showed. She may have happened upon the eggs of birds or turtles and promptly gobbled them up as tasty, protein-rich treats. And if she was lucky enough to come across a carcass of a large mammal, such as an antelope, that hadn't been picked clean, she and her troop mates may have pulled the flesh from the bone, using large rocks.
"They can't afford to be picky eaters as these slow bipeds in a dangerous environment," DeSilva said. "They're eating everything they can get their hands on."
However, there's no evidence that Lucy’s species used fire to cook any of their food.

In the past 50 years, we've created a picture of Lucy's last moments. It's not clear exactly why she was by the lake; maybe she was thirsty, or perhaps it was a great spot to look for food.
But there are two main theories for how she died.
"Perhaps she was down there at the water and — bam! — a crocodile comes out," Johanson said. "Crocodiles are incredibly fast, and it's a dangerous place if you're a little creature" like Lucy.
Johanson found one carnivore tooth mark on Lucy's pelvis, and it had not healed, meaning it occurred around the time of her death. Although the animal that made the mark has not been conclusively identified, "we know that australopithecines were preyed upon because there are a number of examples," Johanson said.
In 2016, Kappelman and his colleagues put forward an alternate ending for Lucy: a catastrophic fall from a tree.
Based on high-resolution CT scans and 3D reconstructions of Lucy's skeleton, Kappelman identified fractures in her right shoulder, ribs and knees that were unlike the typical fracturing that occurs in fossils crushed under the weight of dirt and rocks for millions of years.
"Something traumatic happened here during life," Kappelman said.
The kinds of fractures Lucy suffered are consistent with a fall from a considerable height, perhaps from a tall tree in which she was foraging for food.
"She hit on her feet and then her hands, which meant she was conscious when she hit the ground," Kappelman said. "I don't think she survived very long."
It's not clear whether she was alone when she died. But even if she was with others of her kind, they likely wouldn't have done much with her body.
There's no evidence that A. afarensis "bodies were treated any differently than any other animal," DeSilva said. "Maybe there was some curiosity around it, and then they carried on."
Primate researchers have documented other species' curiosity about inanimate bodies. For example, chimpanzees often stay with a dead group member's body for a few hours or days, sometimes guarding it.
Lucy's group may have done the same for her until her body was naturally buried, which would have happened quite rapidly, perhaps by a flood or mudslide.
In the end, though, "we know very little about how any of these creatures died," Johanson said.

Thanks to Johanson's 1974 discovery of Lucy — as well as other important findings, like the "First Family" and the footprints at Laetoli in Tanzania — we now know quite a lot about A. afarensis.
"It was a highly successful species that was comfortable in lots of different habitats," Johanson said; A. afarensis fossils have been found in Kenya in addition to Ethiopia and Tanzania. "From an evolutionary perspective, her species was highly adaptable," he said.
Lucy has had a broad impact on the field of anthropology.
"The discovery of Lucy really hit the start button for looking in older and older sediments in Africa," Kappelman said. As a result, we have found numerous ancient hominin species and now have 50 years' worth of fossil evidence that human evolution was messy and complex.
Lucy's species was the only human ancestor discovered at Hadar. But a couple dozen miles away at Woranso-Mille, a paleontological site in Ethiopia, Yohannes Haile-Selassie, director of the Institute of Human Origins at Arizona State University, and his colleagues have found evidence of a strange land inhabited by multiple humanlike species between 3.8 million and 3.3 million years ago. For instance, Lucy's kind coexisted alongside another ancient relative, A. anamensis.
Would they have been friends, enemies, competitors or something in between? Right now, anthropologists still have little idea what this landscape teeming with ancient hominins would have looked like.
But perhaps 50 years from now, we'll have a better picture of how Lucy's kind interacted with these other ancient hominins. Even then, Lucy will likely remain one of the most famous fossils of all time.
"I like to think all fossils are pretty special," DeSilva said, "but there's nothing like Lucy."
Editor's note: This article was originally published in November 2024 as part of a special package written for the 50th anniversary of the discovery of the 3.2 million-year-old A. afarensis fossil (AL 288-1), nicknamed "Lucy."
Greater access to large language models (LLMs) and AI tools has further fueled the dead internet conspiracy theory. This idea, posited in the early 2020s, suggests that the internet is actually dominated by AIs talking to and producing content for other AIs — with human-made and disseminated information a rarity.
When Live Science explored the theory, we concluded that this phenomenon has yet to emerge in the real world. But people now increasingly intermingle with bots — and one can never assume an online interaction is with another human.
Beyond this, low-quality content — articles, images, videos and social media posts created by tools like Sora, ChatGPT and others — is leading to a rise in "AI slop." That slop spans everything from Instagram Reels of cats playing instruments or wielding weapons to fake or fictional information presented as news or fact. It has been fueled, in part, by a desire for more online content to drive clicks, draw attention to websites and raise their visibility in search engines.
"The challenge is that a combination of the drive towards search engine optimization [SEO] and playing to social media algorithms has led towards more content and less quality content. Content that's placed to leverage our attention economy (serving ads, etc.) has become the primary way information is served up," Adam Nemeroff, assistant provost for Innovations in Learning, Teaching, and Technology at Quinnipiac University in Connecticut, told Live Science. "AI slop and other AI-generated content is often filling those spaces now."

Mistrust of information on the internet is nothing new, with many false claims made by people with particular agendas, or simply a desire to cause disruption or outrage. But AI tools have accelerated the speed at which machine-generated information, images or data can spread.
SEO firm Graphite found in November 2024 that the number of AI-generated articles being published had surpassed the number of human-written articles. Although 86% of articles ranking in Google Search were still written by people, versus 14% by AI (with a similar split found in the information a chatbot served up), the finding still points to a rise in AI-made content. Citing a report that one in 10 of the fastest-growing YouTube channels shows only AI-generated content, Nemeroff added that AI slop is starting to negatively affect us.
"AI slop is actively displacing creators who make their livelihood from online content," he explained. "Publications like Clarkesworld magazine had to stop taking submissions entirely due to the flood of AI-generated writing, and even Wikipedia is dealing with AI-generated content that strains its community moderation system, putting a key information resource at risk."
While an increase in AI content gives people more to consume, it also erodes trust in information, especially as generative AI gets better at serving up images and videos that look real or information that seems human-made. As such, a deeper mistrust in information, especially in media brands and news, could lead to genuinely human-made content being dismissed as fake or AI-generated.
"I always recommend assuming content is AI-generated and looking for evidence that it's not. It's also a great moment to pay for the media we expect and to support creators and outlets that have clear editorial and creative guidelines," said Nemeroff.
There are two sides to AI-generated content when it comes to the lens of trust.
The first is AI spreading convincing information that requires an element of savvy thinking to check and not take at face value. But the open nature of the web means it's always been easy for incorrect information to spread, whether accidentally or intentionally, and there's long been a need for healthy skepticism and a willingness to cross-reference information before jumping to conclusions.
"Information literacy has always been core to the experience of using the web, and it's all the more important and nuanced now with the introduction of AI content and other misinformation," said Nemeroff.
The other side of AI-generated content is when it's deliberately used to suck in attention, even if its viewers can easily tell it's fabricated. One example, flagged by Nemeroff, is a set of images of a displaced child with a puppy in the aftermath of Hurricane Helene, which were used to spread political misinformation.
Although the images were quickly flagged as AI-made, they still provoked reactions, therefore fueling their impact. Even obviously AI-made content can be either weaponized for political motivations or used to capture the precious attention of people on the open web or within social media platforms.
"AI content that is brighter, louder and more engaging than reality, and which sucks in human attention like a vortex … creates a "Siren" effect where AI companions or entertainment feeds are more seductive than messy, friction-filled, and sometimes disappointing human interactions." Nell Watson, an IEEE member and AI ethics engineer at Singularity University, told Live Science.

While some AI content might look slick and engaging, it might represent a net negative for the way we use the internet, forcing us to question if what’s being viewed is real, and to deal with a flood of cheap, synthetic content.
"AI slop is the digital equivalent of plastic pollution in the ocean. It clogs the ecosystem, making it harder to navigate and degrading the experience for everyone. The immediate effect is authenticity fatigue," Watson explained. "Trust is fast becoming the most expensive currency online."
There's a flipside to this. The rise of inauthentic content could be counterbalanced by people being drawn to content that's explicitly human-made; we could see better-verified information and "artisanal" content created by real people. Whether that's delivered by some form of watermark or locked off behind paywalls and in gated communities on Discord or other forums has yet to be seen. How people react to AI slop, and their growing awareness of such content, will determine the shape of content in the future and how it ultimately affects people, Nemeroff said.
"If people find slop and communicate that slop isn't acceptable, people's consumer behaviors will also change with that," he said. "This, combined with our broader media diet, will hopefully lead people to make changes to the nutrition of what they consume and how they approach it."
AI-made content is only one part of how AI is changing the way that we use the internet. LLM-based agents already come built into the latest smartphones, for example. You'd also be hard-pressed to find anyone who hasn’t indirectly experienced generative AI, whether it was serving up information suggestions or offering the option to rework an email, generating an emoji or automatically editing a photo.
While Live Science’s publisher has strict rules on AI use (it certainly can't be used for writing or editing articles), some AI tools can help with mundane image-editing tasks, such as putting images on new backgrounds.
AI use, in other words, is inescapable in 2025. Depending on how we use it, it can influence how we communicate and socialize online — but more pertinently, it’s affecting how we seek and absorb information.
Google Search, for example, now has an AI Overview serving up aggregated, AI-summarized information before the external search results — something the recently introduced AI Mode builds upon.
“We primarily used the internet via web addresses and search up to this moment. AI is the first innovation to disrupt that part of the cycle," Nemeroff adds. "AI chat tools are increasingly taking up internet queries that previously directed people to websites. Search engines that once handled questions and answers are now sharing that space with search-enabled chatbots and, more recently, AI agent browsers like Comet, Atlas, Dia, and others.”
On a surface level, this is changing the way people search and consume information. Even if someone types a query into a traditional search bar, it’s increasingly common that an AI-made summary will pop up rather than a list of websites from trusted sources.

"We are transitioning from an internet designed for human eyeballs to an internet designed for AI agents," Watson said. "There is a shift toward "Agentic workflows." Soon, you generally won't surf the web to book a flight or research a product yourself; your personal AI agent will negotiate with travel sites or summarize reviews for you. The web becomes a database for machines rather than a library for people."
There are two likely effects of this. The first is less human traffic to websites like Live Science, as AI agents scrape the information they feel a user wants — disrupting the advertising-led funding model of many websites.
"If an AI reads the website for you, you don't see the ads, which forces publishers to put up paywalls or block AI scrapers entirely, further fracturing the information ecosystem," said Watson. This fracturing could even see websites shutting down, given the already turbulent state of online media, further leading to a reduction in trusted sources of information.
The second is a situation where AI agents end up searching, ingesting and learning from AI-generated content.
"As the web fills with synthetic content — AI slop — future models train on that synthetic data, leading to a degradation of quality and a detachment from reality," Watson said. Slop or solid information, this all plays into the dead internet theory of machines interacting with other machines, rather than humans.
"Socially, this risks isolating us," Watson added. "If an AI companion is always available, always agrees with you, and never has a bad day, real human relationships feel exhausting by comparison. Information-seeking will shift from ’Googling’ — which relies on the user to filter truth from fiction — to relying on trusted AI curators. However, this centralises power; we are handing our critical thinking over to the algorithms that summarise the world for us."
Undoubtedly, the ways in which humans are using the internet, and the World Wide Web it supports, have been changed by AI. AI has affected every aspect of internet use in 2025, from how we search for information, to how content is generated and how we are served the information we asked for. Even if you choose to search the web without any AI tools, the information you see could have been produced or handled by some form of AI.
As we’re currently in the midst of this change, it’s hard to be clear on what exactly the internet will look like as the trend continues. When asked about whether AI could turn the internet into a "ghost town," Watson countered: "It won’t be so much a ghost town as a zombie apocalypse."
It’s hard not to be concerned by this damning assessment, whether you're a content creator directly affected by AI or simply an end user who’s getting tired of questioning information.
However, Nemeroff highlighted that we can learn from the rise of social media and its impact on the internet in the late 2000s, which offers an example of the disruption and challenges that new platforms create around the use and spread of information.
"Taking a few pages out of what we learned about social media, these technologies were not without harms, and we also did not anticipate a number of the issues that emerged at the beginning," he said. "There is a role for responsible regulation as part of that, which requires lawmakers to have an interest in regulating these tools and knowing how to regulate in an ongoing way."
When it comes to any new technology — self-driving cars being one example — regulation and lawmaking are often several steps behind the breakthroughs and adoption.
It’s also worth keeping in mind that while AI poses a challenge, the agentic tools it offers can also better surface information that might otherwise remain buried deep in search results or online archives — thereby helping uncover information from sources that might not have thrived in the age of SEO.
The way humans react to AI content on the internet will likely govern how it evolves, potentially bursting an AI bubble by retreating to human-only enclaves on the web or requiring a higher level of trust signals from both human- and AI-made content.
"We find ourselves in a really challenging moment with this," concluded Nemeroff. "Being familiar with the environment and knowing its presence there is a key point to both changing the incentives around this as well as communicating what we value to the platforms that distribute it. I think we will start to see more examples of showing the provenance of higher quality content and people investing in that."
]]>So how many of these key figures do you know? Try our new quiz and find out. We'll be dropping another number in the mix every day for you to guess, and if you prove yourself to be a numberphile, maybe you'll make it to the top of our leaderboard. All you need to do is register, and your score will be saved. Be sure to leave a comment and share how you got on (but no spoilers, please).
—Live Science crossword: Test your knowledge on all things science with our weekly, free puzzle!
—Periodic table of elements quiz: How many elements can you name in 10 minutes?
—How quickly can you name all 12 Apollo astronauts who walked on the moon?
]]>