Nature, Published online: 23 November 2023; doi:10.1038/d41586-023-03706-y
Patrick Vallance, Chris Whitty and others reflect on advising the UK government during the ongoing investigation into the country’s pandemic response.
Nature, Published online: 23 November 2023; doi:10.1038/d41586-023-03720-0
Artificial intelligence can help to recover lost art treasures. Plus, brain cells that control how quickly mice eat have been identified, and how to make ‘loss and damage’ climate funding work.
Nature, Published online: 22 November 2023; doi:10.1038/d41586-023-03682-3
How to ensure polio doesn’t return after eradication, and the space explosion that’s baffling scientists.
Nature, Published online: 22 November 2023; doi:10.1038/s41586-023-06753-7
Exposure to early variants of SARS-CoV-2 results in immune imprinting in mouse models and in humans, reducing neutralizing antibody titres against Omicron variants, which could be mitigated with multiple updated boosters.
Nature, Published online: 22 November 2023; doi:10.1038/s41586-023-06746-6
Analysis of the group II intron ribonucleoprotein shows the molecular interactions involved in branchpoint adenosine recognition, lariat formation and exon ligation, providing clues to the evolutionary conservation of structural components and catalytic mechanisms in pre-messenger RNA splicing.
Nature, Published online: 22 November 2023; doi:10.1038/s41586-023-06762-6
HIV-1 Env trimers bound to one, two and three CD4 molecules are observed at membrane–membrane interfaces between HIV-1 and CD4-decorated virus-like particles.
Nature, Published online: 22 November 2023; doi:10.1038/d41586-023-03561-x
An enzyme wielded by a common skin bacterium acts on nerve receptors that trigger intense itching.
Nature, Published online: 22 November 2023; doi:10.1038/d41586-023-03679-y
Scientists found the cells in mice — and say they could lead to a better understanding of human appetite.
Nature, Published online: 22 November 2023; doi:10.1038/d41586-023-03581-7
Cattle and other animals were slaughtered for rituals during the fifth century BC in what is now Spain.
Until now it was unclear what walking speed was needed to reduce risk of type 2 diabetes
Walking faster is linked to a significantly lower risk of developing type 2 diabetes, according to the first global study of its kind.
Type 2 diabetes is one of the world’s major health threats, with its prevalence rising sharply in the last three decades, according to the World Health Organization.
On 6 March 1987, the Herald of Free Enterprise ferry set sail from the Belgian port of Zeebrugge heading for Dover. The bow doors were not shut and within seconds it flooded with seawater. It capsized and 193 passengers and crew died.
Bill (William) Yule, who has died aged 83, was a child psychologist at the Institute of Psychiatry (now the Institute of Psychiatry, Psychology & Neuroscience) in London and head of clinical psychology at what was then the Bethlem Royal and Maudsley hospital. He was asked to help the surviving children, and speaking about it he said: “It totally changed my career and my life. I’d never come across such raw emotion.”
Science Weekly brings you episode one of a new mini-series from Full Story.
Pioneering Australian scientists Graeme Pearman, Lesley Hughes and Ove Hoegh-Guldberg saw the climate crisis coming. Pearman predicted the increase of carbon dioxide levels, Hughes foresaw the alarming number of species extinctions and Hoegh-Guldberg forecast the mass coral bleaching events we’re seeing today. All three went on to become some of the country’s most respected experts in their fields, travelling the globe, briefing leaders, and assuming the world would take action having heard their alarming findings.
In part one of this three-part series, these climate scientists reveal the moment they realised the planet was heading for catastrophe. What did they do when they found out? How did they think the world would respond? And how do they feel today, looking back on that moment of cognisance?
Cancer Research UK says Britain is lagging behind other comparable countries in terms of survival rates
Thousands of people are dying needlessly from cancer because the UK lags behind comparable countries when it comes to survival rates, a damning report says.
Big strides forward have been made in treating the disease over the past 50 years, according to the study by Cancer Research UK, but slow and late diagnosis coupled with treatment delays mean the progress is “at risk of stalling”.
Natural History Museum prepares to study pristine material gathered in Nasa’s Osiris-Rex mission
A teaspoon’s worth of dark dust and granules scooped from an asteroid 200m miles from Earth has arrived at the Natural History Museum in London, where scientists are preparing to unlock its secrets.
Researchers at the museum received 100mg of the pristine material, which at 4.6bn years old dates back to the dawn of the solar system, after Nasa’s Osiris-Rex mission stopped at asteroid Bennu in 2020 and returned samples to Earth in September.
Injecting stem cells into patients’ brains found to be safe and could stop further damage from the disease
Doctors are cautiously hopeful about a new multiple sclerosis therapy after finding that injecting stem cells into patients’ brains was safe and potentially protective against further damage from the disease.
The small, early stage trial was only able to assess whether injecting cells directly into the brain was well tolerated by patients, but in tests carried out in the year after treatment, researchers found hints that the cells may have a long-lasting, beneficial impact.
The answers to today’s power-hungry philosophers’ puzzle
Earlier today I set you this puzzle by one of the world’s leading logicians about a group of five power-hungry philosophers. Here it is again with the solution.
The philosophers’ ruling council
1 philosopher: proposed council {1}
2 philosophers: proposal rejected
3 philosophers: proposed council {3, 2}
4 philosophers: proposed council {2, 4, 1}
5 philosophers: proposed council {5, 1, 3}
One person found to be infected with H1N2 strain, of which 50 cases have been reported globally since 2005
The UK has detected its first human case of H1N2, a swine flu strain that has been reported in 50 people worldwide since 2005.
Health officials scramble to trace contacts of person infected with H1N2 strain
Health officials are scrambling to trace contacts of a person infected with a new form of swine flu, after the UK detected its first human case of H1N2.
Fifty human cases of the strain have been reported globally since 2005. The new case is the first to be detected in the UK and is unrelated genetically to the previous cases.
The idea of a friend who helps you commit to your goals is growing in popularity – whether you want to get fit, write a novel or build your business
When news emerged that new US Speaker Mike Johnson and his teenage son monitored each other’s pornography intake, the concept of an “accountability partner” was probably unfamiliar to many people. Certainly, the Republican politician hasn’t done much to sell the idea. Rolling Stone magazine, revelling in the “creepy Big Brother-ness” of it all, detailed how the “faith-obsessed” politician used an app called Covenant Eyes. Deployed by churches and missionary groups, it sends out weekly reports flagging up all the potentially nefarious online content consumed by an individual and their appointed sin monitor.
“My accountability partner right now is Jack, my son. He is 17,” Johnson said in a resurfaced news clip from last year. “If anything objectionable comes up, your accountability partner gets an immediate notice.” The mind boggles. For a start, define objectionable? One person’s harmless rap video might be another’s NSFW (not safe for work) nightmare. And where does this level of micro-monitoring end? Today OnlyFans, the mostly pornographic subscriber platform, tomorrow fried chicken Deliveroos and unwise athleisure purchases?
2023 has been a year chock-full of developments in brain-computer interface (BCI) technology: devices that can decode brain signals and use them to control an external device, or use a device to alter brain signals. BCI developers Neuralink, Paradromics, and Synchron all reached important milestones in the clinical trial process, each bringing a fully implantable BCI a little closer to clinical reality in the United States. Those companies are developing BCIs that interact with the brain electrically, but a recently announced research collaboration is developing a BCI that uses a radically different medium: ultrasound.
In October, the companies Forest Neurotech and Butterfly Network announced a US $20 million joint research effort to pursue the development of an ultrasound-based BCI. Forest Neurotech will license Butterfly Network’s compact ultrasound-on-a-chip technology to develop a minimally invasive ultrasound BCI device for use in academic and research settings.
Most BCI systems measure brain activity by reading out electrical signals, and some also electrically stimulate the brain. Forest’s proposed device will instead use high-frequency sound waves to interface with the brain. The scientific principles guiding the approach are unusual but simple: focused ultrasound (FUS) waves directed at neurons can change their action potentials—the ionic currents brain cells use to communicate with each other. Ultrasound can also be used to estimate neural activity within brain regions by measuring local changes in blood flow via the Doppler effect; that technique is known as functional ultrasound imaging (fUSI).
Forest Neurotech CTO William Biederman says that Butterfly’s technology will allow his team to build a BCI that, when implanted in the user’s skull, will use ultrasound to stimulate and record from the brain “with sub-millimeter accuracy.”
Ultrasound has certain benefits over other neural stimulation and imaging techniques. For stimulation, focused ultrasound waves can flexibly target specific regions throughout the brain from outside the skull. In contrast, electrical stimulation techniques are more spatially limited because electricity does not travel very far through brain tissue, so electrically stimulating deep regions of the brain requires invasive surgical procedures that place electrodes near the relevant sites. Electrically recording from the brain requires either deep implants or electrodes on the scalp that can only record activity from brain regions near the surface. Recording deep neural activity via ultrasound can’t be done through the skull, but it can be done by removing a piece of skull and laying the device on the surface of the brain.
Butterfly’s ultrasound hardware places the audio generation, steering, and recording capabilities of larger clinical ultrasound systems onto a single chip-sized device. Forest plans to use these ultrasound chips to stimulate brain regions with steered and focused ultrasound waves, and also to measure neural activity with fUSI.
Butterfly Network’s ultrasound-on-a-chip technology will be used in a new brain-computer interface.
The fUSI technique estimates changes in neural activity by measuring changes in blood flow. Neurons, like all cells, need blood to function. Increases in neural activity require increases in blood flow, which fUSI techniques measure by projecting ultrasound at a brain region of interest and recording the waves that bounce back. When sound bounces off a mass of flowing blood, the returning sound waves wiggle at a different frequency than those emitted. fUSI uses this Doppler-shift phenomenon to estimate changes in blood flow and, by proxy, the electrochemical chatter of neurons.
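The Doppler relation behind fUSI can be sketched numerically. This is a minimal illustration, assuming a typical speed of sound in soft tissue of about 1,540 m/s and a hypothetical 15 MHz transducer; it is not Forest's or Butterfly's actual processing pipeline.

```python
import math

SPEED_OF_SOUND_TISSUE = 1540.0  # m/s, a typical value for soft tissue

def doppler_shift(f0_hz, velocity_m_s, angle_deg=0.0):
    """Frequency shift of ultrasound echoed by blood moving at velocity_m_s,
    from the standard pulsed-Doppler relation f_d = 2 * f0 * v * cos(theta) / c."""
    return 2.0 * f0_hz * velocity_m_s * math.cos(math.radians(angle_deg)) / SPEED_OF_SOUND_TISSUE

def blood_velocity(f0_hz, shift_hz, angle_deg=0.0):
    """Invert the Doppler relation to recover flow velocity from a measured shift."""
    return shift_hz * SPEED_OF_SOUND_TISSUE / (2.0 * f0_hz * math.cos(math.radians(angle_deg)))

# A hypothetical 15 MHz probe looking straight down a vessel with 1 cm/s flow:
shift = doppler_shift(15e6, 0.01)   # a shift of a couple of hundred hertz
flow = blood_velocity(15e6, shift)  # recovers the 0.01 m/s flow speed
```

The angle term matters in practice: flow perpendicular to the beam produces no measurable shift, which is why probe placement is part of the measurement.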
“Using blood flow to image neural activity with functional ultrasound works really well,” says Mikhail Shapiro, an advisor to Forest Neurotech and professor at the California Institute of Technology. “It works much better than anybody expected when the idea was first put forward.”
Ultrasound can both record and control neural activity. While the latter phenomenon has been known since at least the 1950s, scientists still aren’t sure exactly why FUS makes neurons fire. Recent experiments in which ultrasound energy was directed at isolated brain tissue from rodents showed neuroscientists that particular calcium ion channels appear to be thrown open by high-frequency sound waves, though the exact physics of that interaction are still a mystery.
To both stimulate and record with its BCI, Forest’s system will require implanting multiple Butterfly ultrasound chips in a user’s skull. While ultrasound stimulation is effective through bone, fUSI recording techniques are not. Sound is muffled when passing through the skull, and the outgoing and return attenuations make a fully non-invasive ultrasound recording system untenable. Instead, the company plans to place the transducers into the skull, flush with the surface of the dura, a protective membrane that covers the brain.

The collaboration between Forest and Butterfly is part of Butterfly Network’s larger “Butterfly Gardens” program, which seeks to make its technology available to medical device teams. Forest Neurotech, a part of the nonprofit group Convergent Research, is focusing on technological development rather than creating and marketing a clinical device. “Within the lifetime of our nonprofit research organization, we don’t have plans to submit to the FDA,” says Biederman. “We are trying to drive fundamental scientific discovery and uses of this type of technology.” Once the technology is more developed, the company plans to make it available to other research organizations in academia and industry for further product and scientific development.
The compound eyes of an insect are structurally very different from those of a human—almost a separate category of organ altogether from the more familiar ocular arrangement of lens, cornea, iris, and retina. For instance, Papilio xuthus, the Asian swallowtail butterfly, can see wavelengths of light in the ultraviolet spectrum, far beyond the limits of human vision or the capabilities of standard optical electronics in cameras and mobile phones.
Researchers at the University of Illinois Urbana-Champaign (UIUC), inspired by P. xuthus’ unique eyes, have developed a new kind of camera sensor that can detect ultraviolet light. Recently published in Science Advances, the new device extends the capabilities of a real-time optical sensor with a layer of fluorescent crystal that makes it sensitive to ultraviolet light (UV). This design, which mimics the function of fluorescent compounds in the eyes of P. xuthus, may enable the development of new medical devices and imaging technologies.
“As an engineer, you always wonder: what principles can we mimic from [nature’s] visual systems to make better cameras?”
—Viktor Gruev, University of Illinois Urbana-Champaign
The xuthus butterfly can see UV light because of two complementary mechanisms. One is a set of cells in its eyes that, much like the short-wavelength cone cells in human eyes, are most sensitive to blue light but also partially sensitive to near-blue UV light. However, to see shorter wavelengths of light in the UVA and UVB ranges—280 to 400 nm—the butterfly uses a clever trick: cells containing fluorescent dyes within the insect’s eyes radiate a band of green light when excited by UV radiation. These dyes translate otherwise invisible UV light into a signal that the butterfly can see.
Viktor Gruev, a professor at UIUC and an author of the present study, says that the idea to mimic P. xuthus’ visual system in silico resulted from years of collaboration between him and biologists studying how different animals see. “I’ve been working with visual biologists for over ten years,” says Gruev, whose previous collaborations have produced other bio-inspired cameras. “As an engineer, you always wonder: what principles can we mimic from those visual systems to make better cameras?”
The UIUC team built the sensor using a layered CMOS design with a top layer of fluorescent perovskite crystal. Similar to a Foveon chip, the three photosensitive layers of the sensor divide the spectrum of visible light into three bands with peak sensitivities at blue, green, and red. These bands are determined by the width and position of each layer, with red light penetrating down to the lowest sensor layer and blue light only activating the highest layer.
The UV imaging sensor is seen here under white light [top] and under UV light [bottom]. The green appearance is attributed to PNC layer fluorescence. University of Illinois Urbana-Champaign/Science Advances
While this CMOS sensor arrangement does mimic the partial UV sensitivity of the layered photoreceptor cells in xuthus’ eyes, the real magic in the UIUC chip comes from a layer of perovskite crystal deposited on the sensor’s surface. The perovskite nanocrystal (CsPbBr3) is both fluorescent under UV radiation and transparent to visible light. This fluorescence causes the crystal layer to glow green when irradiated with UV light.
Just like the fluorescent dyes found in P. xuthus eyes, the perovskite nanocrystals translate incoming UV light into a wavelength that the CMOS sensor already responds to. Paired with the partial sensitivity of the blue-band sensor layer to near-UV light, the perovskite fluorescence enables the sensor to resolve exact wavelengths of UV light—as small as 250 nm—in real time and at any point on its surface. “The UV light is detected in two components,” says Shuming Nie, a UIUC professor and another author on the paper. “The first is the UV absorbed by the nanocrystals that convert it into a visible signal—in our case, green—and that’s picked up by the photodiode. There’s also a residue UV picked up by the top layer photodiode. We can use the ratio of those two to calculate the UV wavelength.”
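Nie's two-channel ratio method can be sketched as a simple calibration lookup. The calibration numbers below are invented for illustration (the paper reports its own measurements); the sketch only assumes that the green-to-residual-UV ratio varies monotonically with wavelength.

```python
# Hypothetical calibration pairs: (green / residual-UV photodiode signal ratio,
# UV wavelength in nm). Real values would come from imaging known
# monochromatic UV sources with the finished sensor.
CALIBRATION = [(0.5, 400.0), (1.0, 370.0), (2.0, 340.0),
               (4.0, 310.0), (8.0, 280.0), (16.0, 250.0)]

def estimate_uv_wavelength(ratio):
    """Estimate the incoming UV wavelength from the two-channel signal ratio by
    linear interpolation over the calibration table; clamps outside its range."""
    pts = sorted(CALIBRATION)
    if ratio <= pts[0][0]:
        return pts[0][1]
    if ratio >= pts[-1][0]:
        return pts[-1][1]
    for (r0, w0), (r1, w1) in zip(pts, pts[1:]):
        if r0 <= ratio <= r1:
            t = (ratio - r0) / (r1 - r0)
            return w0 + t * (w1 - w0)
```

Because the lookup runs per pixel, the same ratio trick scales from a single photodiode pair to a full 2D image, which is what distinguishes the UIUC sensor from a point spectrometer.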
The researchers say this is a big step forward in UV sensing technology. “There is no, that I’m aware of, UV sensing technology that gives you a 2D image in real time with exact wavelength resolution all at once,” says Gruev. “UV spectrometers can give you more spectral information, but only at a single point.”
The researchers propose that their fast, flexible sensor has direct biomedical applications. Cancer cells overproduce certain proteins and amino acids that fluoresce in the UV spectrum when irradiated. Clinical devices could use this principle to detect cancerous tissues quickly and directly. The researchers show in the paper that the device can resolve the distinct fluorescent UV signatures of the cancer-indicating biomolecules tyrosine, elastin, tryptophan, and nicotinamide adenine dinucleotide (NADH). They also show that the device can distinguish between lab cultures of cancerous and non-cancerous cells based on their UV fluorescence.
In future work, the researchers plan to expand on their design by adding multiple perovskite crystal layers that fluoresce in different bands, which would provide better UV wavelength resolution. “A better way to resolve wavelengths is with two-color nanocrystals,” says Nie. “This gives you more real estate, more space, to resolve the UV wavelengths.” Adding more color-tuned crystal layers would heighten the device’s cancer detection capabilities, Nie says. “With only one layer of nanocrystal,” he continues, “some of the UV signals from amino acids in tumor cells are clustered together. If we use the two-color crystals, they are better separated.”
Marc Gauthier was 36 years old when he was diagnosed with Parkinson’s disease. At the age of 44, Marc, who lives in Pessac, France, was treated with a deep brain stimulator device that helps to control tremors in his arms. As he grew older, the disease made it difficult for Marc to walk, causing him to lose balance and fall up to five or six times a day.
In 2021, at the age of 61, Marc underwent an experimental medical procedure that has given him back his mobility. Scientists and doctors with the NeuroRestore group at the Swiss Federal Institute of Technology (EPFL) in Lausanne, Switzerland, surgically implanted an array of epidural electrodes within Marc’s vertebrae. These electrodes can stimulate neurons in Marc’s spine in a way that enables the nerves and muscles controlling his gait and balance to function more like they did before he developed Parkinson’s. The team documented the procedure in a paper published 6 November in the journal Nature Medicine.
Parkinson’s disease is a neurological disorder characterized by the progressive loss of motor control. While the disease develops differently in every individual, Parkinson’s generally leads to a nervous-system-wide inability to produce the neurotransmitter dopamine and a loss of the neurons that transmit it. These dopaminergic neurons are necessary for many brain functions, including fine motor control. As these neurons are lost, Parkinson’s patients develop the tremors commonly associated with the disease.
“You’re taking advantage of the circuitry of the spinal cord. The spinal cord is doing it for you.” —Eduardo Moraud, NeuroRestore
To counteract these tremors, the research team at EPFL developed a novel intervention: targeted electrical stimulation of the patient’s spine. The researchers had previously seen success using the same stimulator system for spinal injury patients. The team found that in primate models of Parkinson’s, tuned patterns of electrical stimulation applied to the spinal cord could restore a smooth and steady gait to primates with impaired mobility. The team then sought to translate these results to a clinical treatment, and found a willing participant in Marc through their connections to the Lausanne and Bordeaux medical community. “He was never scared,” says researcher and NeuroRestore group member Eduardo Moraud. “He was very courageous in that sense.”
The technology works by electrically stimulating structures on the spine called dorsal root ganglia, where sensory nerves return to the spine from the body’s extremities. Normally, the dorsal roots coordinate the delicate balance of muscles required for smooth, easy walking. By stimulating the neurons in the dorsal ganglia, the epidural stimulator gets those reflex-driven circuits working as they were before the onset of Parkinson’s disease. “You’re taking advantage of the circuitry of the spinal cord,” says Moraud. “The spinal cord is doing it for you.”
The stimulator array—a pad of electrodes arranged in two columns—was surgically inserted alongside the back of Marc’s spinal cord. The array was positioned to specifically stimulate the dorsal ganglia that most directly controlled Marc’s gait. Once inserted between the vertebrae and the spinal cord, the stimulator was fastened to nearby vertebrae to keep it in place.
After surgical implantation and months of physical rehabilitation, Marc has recovered his ability to walk in a smooth, controlled manner. The device is capable of delivering different stimulation patterns, depending on the situation. The standard pattern is a constant stimulation to compensate for Marc’s baseline imbalances. While this baseline stimulation is enough to keep Marc walking at an easy pace, the stimulator system can drive more effortful leg motions—climbing stairs, getting up from chairs, or walking on uneven terrain—with a supplementary stimulation pattern driven by signals collected from a set of sensors placed on Marc’s feet and legs that measure his intended motions.
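The sensor-driven switching between stimulation patterns described above can be caricatured as a small decision rule. Everything here (the sensor channels, thresholds, and program names) is hypothetical; the article does not describe NeuroRestore's actual controller in detail.

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    foot_pressure: float  # normalized insole pressure, 0 (swing) to 1 (full stance)
    leg_accel: float      # m/s^2, acceleration magnitude from a leg-worn motion sensor

BASELINE = "baseline"  # constant compensatory stimulation
EFFORT = "effort"      # boosted pattern for stairs, standing up, or rough ground

def choose_program(frame, accel_threshold=3.0, pressure_floor=0.1):
    """Escalate to the effortful pattern when the sensors suggest a demanding motion:
    a large leg acceleration or a foot briefly unloaded mid-step."""
    if frame.leg_accel > accel_threshold or frame.foot_pressure < pressure_floor:
        return EFFORT
    return BASELINE
```

A real controller would run continuously on streamed sensor data with hysteresis to avoid flickering between programs; this sketch only shows the shape of the decision.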
Speaking through a translator at a recent press conference, Marc stated, “I experienced a first rebirth twenty years ago when I received a deep-brain stimulation implant. I then experienced a second rebirth two years ago when I received the spinal stimulator. I can now walk with much more confidence and my daily life has profoundly improved.”

With support from the Michael J. Fox Foundation for Parkinson’s Research and the medical device developer Onward, the research team at EPFL hopes to repeat the treatment in six more patients, beginning early next year.
Finding ways to interface electronics with our biology could help treat neurodegenerative diseases, improve prosthetics, and even lead to novel computing technologies. A chip made from organic electronics that can imitate the way the retina works brings that goal a step closer, its creators say.
Hooking conventional electronics up to biological systems can be challenging, because they are not biocompatible and their rigid structures don’t marry well with soft, living tissue. Organic electronics have emerged as a promising alternative, as they can be made from flexible polymers that aren’t toxic and don’t elicit an immune response. Now, researchers have created a new organic electronic device that can react to light, transmit signals using ions, and remember the amount of signal that has previously flowed through it, mimicking the way neurons in the retina operate.
Described in a paper in Nature Communications, the device is currently just a proof of concept. Ottavia Bettucci, an assistant professor of organic chemistry at the University of Milano-Bicocca and co-author of the paper, says that in the future the technology could potentially be used to create neural implants designed to treat conditions such as Parkinson’s disease and Alzheimer’s disease, or retinal implants to treat some forms of blindness.
“In all the pathologies in which the neural communication is not working in a proper way, our material and our device could help in restoring this interaction,” she says.
The materials used to build the device were chosen with the goal of being implanted in the body.
The researchers describe the device they’ve built as an organic photoelectrochemical transistor. It consists of two thin films of conductive polymer deposited onto a glass substrate, which are connected by an electrolyte. One of the two polymer films acts as a channel between source and drain electrodes, while the other acts as a gate.
Applying a voltage to the gate causes ions to migrate from the electrolyte into the channel, changing its conductivity. But the researchers also modified the polymer at the gate by integrating photosensitive molecules, which means the gate can also be controlled by light. Crucially, the materials used to build the device were chosen with the goal of being implanted in the body.
“The gate terminal is just a complete organic biocompatible material, which potentially can make these be interfaced with biological cells.” says Francesca Santoro, a professor of neuroelectronic interfaces at RWTH Aachen University in Germany and co-author of the new study. “The fact that it’s made of polymer materials also makes it very good for being integrated into flexible substrates.”
The device shares key characteristics with biological neurons, says Santoro. In addition to using ions to modulate the electrical behavior of the device, the response to light is intensity dependent, much like biological photoreceptors. And the device was able to mimic the way connections between neurons strengthen or weaken over time based on the amount of activity they see—a key ingredient in learning.
The researchers showed that if the gate was subjected to two light pulses less than five seconds apart, a higher current was seen after the second pulse. But if they applied 500 light pulses in a row, one second apart, the conductivity of the channel fell with each pulse. Once the light signal was removed the conductivity gradually recovered, but still hadn’t returned to normal after more than half an hour.
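The pulse-train behavior reported above (facilitation for closely spaced pairs, run-down under long trains, and only partial recovery after half an hour) can be reproduced qualitatively with a toy two-variable model. This sketch is purely illustrative and is not the model, if any, used in the Nature Communications paper; all parameters are invented.

```python
import math

class ToySynapse:
    """Illustrative two-variable model: a fast facilitation variable f that jumps
    on each light pulse and decays with time constant tau_f, and a slow depression
    variable d that depletes a little on each pulse and recovers toward 1 with
    time constant tau_d. All times are in seconds; parameters are made up."""

    def __init__(self, tau_f=5.0, tau_d=2000.0, df=0.3, dd=0.01):
        self.f, self.d, self.t = 0.0, 1.0, 0.0
        self.tau_f, self.tau_d, self.df, self.dd = tau_f, tau_d, df, dd

    def _relax(self, t):
        dt = t - self.t
        self.f *= math.exp(-dt / self.tau_f)                        # facilitation fades
        self.d = 1.0 - (1.0 - self.d) * math.exp(-dt / self.tau_d)  # slow recovery
        self.t = t

    def pulse(self, t):
        """Apply a light pulse at time t; return the evoked response amplitude."""
        self._relax(t)
        response = (1.0 + self.f) * self.d
        self.f += self.df          # the pulse boosts facilitation...
        self.d *= 1.0 - self.dd    # ...and depletes the slow resource
        return response

# Two pulses 2 s apart: the second response is larger (facilitation).
s = ToySynapse()
r1, r2 = s.pulse(0.0), s.pulse(2.0)
assert r2 > r1

# 500 pulses 1 s apart: responses run down (depression), and 30 minutes of
# silence brings only partial recovery, as in the reported experiment.
s = ToySynapse()
train = [s.pulse(float(t)) for t in range(500)]
assert train[-1] < train[0]
recovered = s.pulse(499.0 + 1800.0)
assert train[-1] < recovered < train[0]
```

The key design point is the separation of timescales: a seconds-scale variable produces the paired-pulse effect, while a much slower one produces the long run-down and sluggish recovery.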
This is similar to how things work in biology, says Santoro, where two complementary processes known as short-term plasticity and long-term plasticity operate over different timescales to modulate the strength of neural connections. “This is important because a number of patterns that we have in our visual recognition system are also result of a long-term interaction, not just short-term,” she says.
While going from a proof of concept to a practical device will be a long road, Bettucci says the team made several decisions with that in mind: they used cheap, commonly available materials and significantly simplified the process of synthesizing the light-sensitive polymer. The main barrier to creating devices that can be implanted in the body will be switching from a glass substrate to something that is flexible and biocompatible, says Bettucci.
The current device also operates much more slowly than biological neurons, says Santoro, and it is only able to react to biological signals, not transmit them. That’s why her group is already working on a follow-up device that can release chemical signals to create a feedback loop with biological systems.
As well as holding promise for developing fully organic neural implants, Santoro thinks the technology could be used to interface conventional electronics with biological systems. She has several projects looking at how these artificial neurons could be used to drive prosthetics, and her team is also working with an ophthalmologist who has experience with retinal implants.
“He can really give us good direction on where we can make improvements to the platform, so it could potentially be integrated into existing or new classes of implantable retinal devices,” she says.
Pixium’s PRIMA system, currently implanted in 47 users in Europe and the U.S., comprises a tiny implant in the retina, connected to a pair of video glasses and an external computing unit. The system is currently undergoing human clinical trials for the treatment of dry age-related macular degeneration (dry AMD).
If that sounds familiar, it’s a similar set-up to the one offered by Second Sight, a U.S. retinal implant company that, in facing looming bankruptcy proceedings, suddenly left its community of users in the lurch in 2020. The fallout from the company’s abandonment of its implantees was detailed in an award-winning investigation by IEEE Spectrum last year.
“You want to... think critically about what good innovation is if people can only use it for a small period of time.”
—Michele Friedner, University of Chicago
With two leading retinal implant companies now having run out of money, does that signal wider problems with the technology? And what does it mean for people using—or hoping to use—their systems?
There are important differences between Pixium’s and Second Sight’s systems. Despite being much smaller than Second Sight’s Argus II device, Pixium’s PRIMA has over six times as many electrodes, meaning it can produce a more detailed artificial image. And where the Argus II system sent visual data from the glasses to the implant via radio waves, PRIMA beams information from the glasses into the eye using near-infrared radiation. The same infrared signal also powers the tiny implant.
Dry AMD affects hundreds of thousands more people in the U.S., and millions more worldwide, than the retinitis pigmentosa targeted by the Argus II, giving PRIMA a larger potential market.
Daniel Palanker, a professor of ophthalmology at Stanford University who licensed his technology to Pixium, remains bullish on its potential. “Many startups fail for many reasons. In some the technology doesn’t work, but in our case technology actually did work,” he told Spectrum. “It’s just the climate of fundraising has now turned very difficult.”
A feasibility study with a handful of patients in France indicated that the PRIMA system could be safely implanted, and restored some light perception in parts of the retina where users had lost all vision, according to the company. The results of a similar feasibility study in the U.S., and a much larger pivotal trial in Europe, are due early next year, Pixium says. The company says they then hope to move forward with a pivotal trial in America, and begin selling the PRIMA in Europe as soon as 2025, followed by the U.S. a year or two later.
But all of that depends on what happens to Pixium now. The company is currently in a French legal state similar to U.S. Chapter 11 bankruptcy and is heading toward the equivalent of Chapter 7. That would begin a liquidation process in which Pixium settles its debts as best it can, while its main assets—its staff, intellectual property, and the clinical trials—would, ideally, be acquired by a buyer.
“The most important thing is that no patients will be hurt by this,” Lloyd Diamond, Pixium’s CEO, told Spectrum. “All patients have been implanted and the post-surgical period is over. There’s always a potential that there’s a latent problem later on. But these patients will continue to be followed, and they will be taken care of by the public health systems in the countries in which they reside.”
Diamond points out that all 47 PRIMA users were implanted under clinical trial conditions, rather than as paying customers as Second Sight’s users were, and thus enjoy more protection and support from the hospitals that operated on them.
However, he says that if Pixium cannot find a buyer in the coming weeks, users’ long-term prospects would be uncertain. “That’s where it becomes tricky,” says Diamond. “If there’s a problem with the device, there of course wouldn’t be individuals that could service it.”
An engineer at French startup Pixium Vision presents a tiny PRIMA retinal chip. Eric Piermont/AFP/Getty Images
That worries Michele Friedner, a medical anthropologist at the University of Chicago. “There’s all this focus on innovation and not on maintenance, the afterlife of that innovation,” she says. “You want to encourage innovation, but you also want to think critically about what good innovation is if people can only use it for a small period of time. When somebody uses it and then has to stop, it can be devastating on so many levels.”
This was the experience for many Second Sight patients, as Spectrum reported in 2022, some of whom resorted to home repairs or sourcing Argus II components from other users.
The connection between Second Sight and Pixium is more than incidental, too. After Second Sight had stopped making or supporting the Argus II, the company planned a merger with Pixium in 2021 that was intended “to create a global leader with the potential to treat nearly all forms of blindness,” according to a press release at the time.
That merger fell through, leading Pixium to sue Second Sight for pulling out of the deal. A French court awarded Pixium over 2.5 million euros ($2.6 million) in December, although the American company is appealing the judgment.
“We don’t in any way take any pleasure in Pixium’s troubles,” says Jonathan Adams, CEO of Cortigent, the company that inherited Second Sight’s assets. “I hope Pixium is going to find a white knight.”
Second Sight’s own white knight, Cortigent’s parent company Vivani, has been able to help 24 stranded Argus II users with replacement glasses, computing units, and batteries from its dwindling stock, the company says. But rather than resurrecting Second Sight’s retinal technology, Cortigent is now pursuing a brain implant to help people with blindness. “We believe cortical stimulation could address a much larger number of people and turn the business model of retinal stimulation, which was not able to succeed, into something that could be very commercially attractive,” says Adams.
If Pixium Vision is unable to find a single buyer for its assets, there is a chance that Cortigent might bid for some of its intellectual property, says Adams. “But the application of it to the brain does provide some difference, and as we look into whether or not any of the Pixium IP could be of value to us, that difference may be important,” he says.
Diamond remains hopeful that Pixium will find a buyer prepared to invest the 60 million euros ($64 million) he estimates it will take to bring PRIMA to market. But others see the future of vision implants firmly in the brain. Elon Musk’s brain implant company Neuralink, which has proposed using its technology to restore vision, raised $280 million in August.
“By going directly to the vision cortex, it’s a much easier surgery than putting an electrode array in the retina,” says Adams. “And then what’s exciting is that you can take that array and move it to the motor cortex for recovery from stroke paralysis. We see much bigger market opportunities in the brain than trying to fine-tune particular applications for small groups of patients.”
As of press time, the deadline for potential purchasers to submit offers for Pixium’s assets was 20 November.
Taking advantage of DNA’s ability to store information, researchers have over the past decade been working on data storage systems and digital logic circuits built on the biological molecule. DNA’s complex physical structure has also led to the exciting technique of DNA origami, with which researchers have made tiny nanoscale machines.
In the latest example of clever DNA origami, an international team of researchers has now developed a nanoengine that can perform pulsing movements. The pulsing nanomotor should be easy to couple to other tiny molecular machines, driving them to construct drugs and electronics one building block at a time. “We have designed the nanoengine as a broadly applicable module that can be coupled to all possible origami structures to set them in motion,” says Michael Famulok, a professor of chemical biology at the University of Bonn.
The tiny nanomotors look like the spring hand grips used for forearm-strengthening exercises. Two rigid arms, connected by a spring in a V-shaped structure, rhythmically open and close as long as the solution they are in contains a chemical fuel.
University of Bonn
DNA origami involves carefully designing DNA sequences that can fold, twist, and staple together at the right places to assemble into preconceived structures. Using DNA origami tricks, scientists have made many different types of DNA nanomotors and machines. Many of these nanoassemblies are driven by Brownian motion: the constant, random movement of molecules.
Famulok and his colleagues have been working on building DNA nanosystems that are driven by chemical energy. Specifically, the fuel they use is RNA polymerase, the natural enzyme that transcribes DNA into single-stranded RNA molecules, which are then used to synthesize proteins. In 2018, the researchers reported a nanomachine composed of a DNA ring that, driven by RNA polymerase, could be set into motion like a wheel.
In the new work, two stiff DNA origami arms are connected by a short DNA sequence that serves as a release spring. The whole structure measures 70 nanometers by 70 nm by 12 nm. The researchers attached RNA polymerase to one of the arms, near the DNA spring strand. The polymerase grabs the strand and starts transcribing its A, C, G, and T building blocks letter by letter, in the process pulling on the strand and drawing the arms closer together.
When it reaches the end of the DNA strand, a special termination sequence instructs the polymerase to let go of the strand. “The energy stored in the spring causes it to snap open,” Famulok says, and the arms return to their original positions. “Then the whole process starts over again. Once you have set it in motion, as long as [RNA polymerase] is present, it will continue to close and open.”
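The cycle Famulok describes (transcribe, pull closed, terminate, snap open) can be caricatured as a simple state machine. The following Python sketch is purely illustrative: the strand length, step count, and gap units are invented, not parameters from the paper.

```python
# Toy state machine for the pulsing nanoengine. All numbers here
# (strand length, step count, gap units) are invented for illustration;
# they are not values from the paper.

def nanoengine_gaps(strand_length=10, steps=25):
    """Return the arm separation after each transcription step.

    The polymerase advances one 'letter' per step, pulling the arms
    together; past the termination sequence it releases the strand
    and the spring snaps the arms back to fully open (gap = 1.0).
    """
    gaps = []
    pos = 0  # letters transcribed along the spring strand
    for _ in range(steps):
        pos += 1
        if pos > strand_length:   # termination sequence reached
            pos = 0               # polymerase lets go; spring snaps open
            gaps.append(1.0)
        else:                     # arms close in proportion to progress
            gaps.append(1.0 - pos / strand_length)
    return gaps

gaps = nanoengine_gaps()
```

Plotted, `gaps` traces a sawtooth: the arms ratchet shut over ten steps, then spring open in one, and the cycle repeats for as long as the loop (standing in for the chemical fuel) keeps running.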
The biological molecules are robust, and there was no indication that the DNA spring would wear out in the laboratory experiments, Famulok says.
As a demonstration, the researchers attached the nanoengine to another V-shaped DNA origami arm that did not have a spring. As the nanoengine closed and snapped open, it drove the second “follower” unit to shut and release along with it.
Famulok’s vision is to use the pulsing nanomotor to drive several DNA origami machines “in a sort of molecular assembly line” that could build complex chemicals such as artificial proteins, or drive chemical reactions that are otherwise difficult to achieve. To do that, the researchers are now trying to devise a clutch-type mechanism that, in response to a chemical or other trigger signal, would detach itself from one follower unit and couple to another one.
Researchers have developed a magnetic gel that can help diabetic wounds heal roughly three times as fast as they would with a conventional wound dressing. The magnetic gel, in combination with an external device that generates a dynamic magnetic field, gives the skin cells around the wound a gentle workout.
In a new study, researchers at the National University of Singapore focused on how to encourage chronic wounds, especially in people with diabetes, to heal more quickly. Diabetes often compromises natural healing, so diabetics frequently develop chronic wounds that are slow to close.
For example, diabetic foot ulcers afflict 15 to 25 percent of the roughly 30 million people living with diabetes in the United States. These and other chronic wounds linked to diabetes are the leading cause of lower-limb amputations worldwide, aside from amputations necessitated by accidents or injuries. Currently, more than 540 million people globally are living with diabetes, according to the World Health Organization, and that number is expected to rise significantly.
“Approximately 10 to 15 percent of diabetic foot ulcers fail to heal,” says study senior author Andy Tay, a biomedical engineer at the National University of Singapore. “Shockingly, about a quarter of these non-healing cases eventually necessitate amputation.”
Conventional dressings for wounds “fail to fundamentally accelerate the healing of wounds. They mainly serve to prevent wounds from deteriorating, requiring patients to follow a schedule for dressing changes every two to three days over an extended period,” Tay explains. “The fundamental reason behind their limited effectiveness lies in their inability to simultaneously address multiple critical factors associated with diabetic wounds, as well as their incapacity to modify the local microenvironment.”
Andy Tay [center] holds a plaster loaded with magnetic gel, while Shou Yufeng [right] holds a device that provides magnetic stimulation. Le Zhicheng [left] holds a sample of the magnetic gel in liquid form. National University of Singapore
In the new study, Tay and his colleagues investigated strategies to more actively treat chronic wounds, rather than wait for the body to take its time. They focused on the way in which the mechanical forces that a person’s skin normally experiences during daily activity can help spur healing. Those mechanical forces can trigger cell proliferation and migration, the secretion of growth factors, and the formation of new blood vessels. However, patients with wounds are usually advised not to walk around or undertake more rigorous activities.
The scientists found another way to create a mechanical force by experimenting with a gel that had been loaded with tiny magnetic particles. When exposed wirelessly to a dynamic magnetic field, the gel exerts a gentle, rhythmic mechanical force.
Aside from the magnetic particles, the researchers loaded the magnetic gel with two kinds of U.S. Food and Drug Administration-approved skin cells—keratinocytes, which are essential for skin repair, and fibroblasts, which help form connective tissue. The mechanical stimulation from the gel can help encourage the skin cells to become more active. The scientists also loaded this gel with insulin, which can influence blood sugar levels in wounds to help guide the rate of healing.
“One of the most compelling features of this platform is its holistic ‘all-in-one’ approach to wound healing, targeting multiple aspects of the healing process simultaneously,” Tay says.
The scientists placed this gel in a bandage to help treat wounds in diabetic mice. They found the bandages helped wounds heal roughly three times as fast as current approaches, after applying magnetic stimulation for about one or two hours.
The novel dressings also promoted healing in other ways, the researchers say. For example, the bandages improved fibroblast growth rate by about 240 percent, and more than doubled the production of collagen, a key protein for wound healing. In addition, the researchers found that magnetically directed mechanical stimulation of the gel could help control the rate at which insulin was delivered into wounds.
“Our technology takes a comprehensive approach, effectively targeting several critical factors linked to diabetic wounds,” Tay says. “It not only manages elevated glucose levels within the wound area, but also reactivates dormant skin cells near the wound, restores damaged blood vessels, and repairs the disrupted vascular network within the wound.”
The researchers noted this new technology may have applications beyond healing diabetic foot ulcers. For instance, it may help treat complex wounds such as burns, Tay says.
Currently, the researchers are conducting additional tests to refine the gel to enhance its effectiveness, Tay says. “In addition, we are collaborating with a clinical partner to assess the gel’s effectiveness using human tissues from diabetic patients,” he adds. “Together, we are working on addressing key challenges related to regulation, industrialization, and clinical implementation.”
The scientists detailed their findings online 8 September in the journal Advanced Materials. They have also filed a patent for the technology.
Blind and low-vision (BLV) people often use sound to navigate, such as via echolocation or within various assistive technologies. Scientists at the University of Technology Sydney (UTS) and Aria Research, a company that makes bionic devices, decided to blend the two to develop a technique called “acoustic touch.” When used in combination with smart glasses, acoustic touch converts objects in front of the user into auditory icons.
Acoustic touch uses head movement—head position also being key in echolocation—to dictate what sound icons play to support the exploration of a surrounding environment. Howe Yuan Zhu, one of the paper’s authors, describes acoustic touch as a user interface based on sensory-motor coupling. In this case, sensory feedback is generated by the wearer’s head movement.
Imagine that there is a virtual cone extending out in front of one’s head, he says. Any object in the region of this cone will be represented with a unique audio signature. “The wearable glasses might still see more objects, but it will only relay to the user the objects within this narrower field of view.” Their paper was published in PLOS One last month. The research team also included two blind researchers.
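Zhu’s virtual cone amounts to a simple geometric test: an object triggers its audio icon only if the angle between the head’s forward direction and the vector from the head to the object is small. A minimal sketch, assuming plain 3D vectors and a guessed 20-degree half-angle (the system’s actual field of view may differ):

```python
import math

def in_acoustic_cone(head_dir, obj_vec, half_angle_deg=20.0):
    """True if the object lies inside the virtual cone along head_dir.

    head_dir and obj_vec are (x, y, z) tuples; the 20-degree
    half-angle is an assumed value, not one reported in the paper.
    """
    dot = sum(h * o for h, o in zip(head_dir, obj_vec))
    mags = math.hypot(*head_dir) * math.hypot(*obj_vec)
    cos_angle = max(-1.0, min(1.0, dot / mags))
    return math.degrees(math.acos(cos_angle)) <= half_angle_deg

# Object slightly off-center: inside the cone, so its icon would play.
print(in_acoustic_cone((0, 0, 1), (0.1, 0, 1)))   # True
# Object far off to the side: outside the cone, so it stays silent.
print(in_acoustic_cone((0, 0, 1), (1, 0, 0.2)))   # False
```

As the wearer scans their head, each detected object passes in and out of this cone, which is what turns head movement into a selection mechanism.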
The researchers tested the interface with 14 participants—7 BLV individuals and 7 sighted people who were blindfolded—wearing acoustic touch-enabled smart glasses. Participants had to identify objects on a table in front of them. The BLV participants performed well in recognizing and reaching for objects, without significant added mental exertion.
“We were focused on understanding, is this learnable?”
—Howe Yuan Zhu, University of Technology Sydney
Wearables traditionally relay information using computer vision along with computer-generated speech. With acoustic touch, however, even though the glasses can detect everything before the wearer, the “head-scanning” movement creates a “middle layer.” This allows the user to decide what they want to explore first, and choose that.
“One of the key questions we weren’t sure about is how intuitive head movement was,” Zhu says. “We know in vision, it plays a key role in how we observe a room…but [with] audio, we know it plays a role, but not how much.” Their observations suggested that the head-scanning movement wasn’t something that required a significant amount of added effort. “Even though it was a bit more physical effort, the participants were still able to pick it up, and still found it somewhat intuitive,” Zhu says.
For their research, they also built a benchmark platform that used computer vision and object-recognition algorithms to recognize 120-odd objects. However, in the tests, they used only four objects—to understand if a user preferred the interface to play icons for all the objects, or if it was better for them to use head movement to selectively explore. “We were focused on understanding, is this learnable? Can someone build mental associations between a similar sound to the object? And then we compared that against speech,” Zhu says. “We don’t want to necessarily dismiss speech-based interfaces [either].” For example, for identifying a specific person, or to describe more complex objects, it might be simpler to use speech.
He also clarifies that while their research is about the technique of acoustic touch rather than building smart glasses (which is something Aria Research does), they did make some technical observations that could be relevant in the future. For example, they noticed that the speed of head rotation could be quite important. “If [the user] sweeps too fast, they’re more likely to miss objects,” Zhu says, “because the computer vision is just not fast enough.”
One of the main limitations of the current study was that it was heavily controlled and conducted in a closed, indoor environment, conditions that don’t reflect the real world, Zhu says. Another technical challenge was accurate object recognition. “But that’s something that’s continually improving,” he says. “And even between the time of the study and now…object recognition is a lot better.”
Since the publication of the paper, he says, they have been continuing their research using more real-world environments, wider contexts, and more objects. This includes navigating a maze using sound icons to dictate a path, or the way around an obstacle. Zhu notes that using computer speech is more likely to slow people down, as the person would need to stop to process it. “Whereas,” he adds, “if we played just the audio icon that pings for a lane, the person actually could just follow along, using it like a cue that you just walk along with.”
Is there a better way for scientists to shine a light on nerve cells throughout the body? When researchers Xinyue Liu and Siyuan Rao first began their collaboration at MIT, they treated this question literally.
Optogenetics is an interdisciplinary branch of science in which cells are genetically altered to be light-sensitive, making it possible to inhibit or excite cells, and study their function by applying colored light. Typically, the light-transmitting wires that reach target cells have been made of materials that work well when stationary in the brain. However, if they’re implanted elsewhere in a test animal’s body, they could break, or damage tissue, or affect behavior—making it difficult to study the peripheral nervous system and pain in particular.
“This flexible fiber expands the toolbox of approaches we have.”
—Rob Bonin, University of Toronto
Now, Liu, Rao, and colleagues have developed a soft, flexible, durable optical fiber, made from hydrogel, capable of delivering an optogenetic light signal away from the brain or spine. The filament consists of an inner core and an outer cladding made from two versions of the hydrogel with different refractive indices, yet is only about a millimeter in diameter.
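The core-and-cladding design guides light the same way a conventional step-index fiber does: light stays confined by total internal reflection as long as the core’s refractive index exceeds the cladding’s, and the index contrast sets the fiber’s numerical aperture. A quick sketch with hypothetical index values (the measured values for the two polyvinyl alcohol hydrogels may differ):

```python
import math

def numerical_aperture(n_core, n_clad):
    """NA of a step-index fiber: NA = sqrt(n_core^2 - n_clad^2).

    sin(theta_max) = NA gives the largest launch angle (from air)
    at which light still undergoes total internal reflection
    at the core-cladding boundary.
    """
    if n_core <= n_clad:
        raise ValueError("core index must exceed cladding index to guide light")
    return math.sqrt(n_core**2 - n_clad**2)

# Hypothetical indices for the two hydrogel formulations; not measured values.
na = numerical_aperture(1.41, 1.35)
theta_max = math.degrees(math.asin(na))  # acceptance half-angle in air
```

Even a modest index contrast between the two hydrogel versions yields a usable acceptance cone, which is why the same material family can serve as both core and cladding.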
Researchers described the optical fiber and a variety of ways it was put through its paces in model mice in a paper in Nature Methods, published 19 October. The work adds another technique—and a bit of flexibility—to the repertoire of optogenetics, the study of the peripheral nervous system, and possibly future translational medicine, including the treatment of chronic pain and nerve disorders.
“This flexible fiber expands the toolbox of approaches we have for peripheral optogenetic work,” said Rob Bonin, a pain researcher at the University of Toronto who was not involved in the research, citing flexibility and durability as two major advantages of the new approach.
Broadly, hydrogels are soft networks of polymers and water, such as tofu or jelly. “Our body is also made of hydrogels. Except for bones and teeth, our muscles and other organs are all actually hydrogels,” said Liu, a materials scientist now at Michigan State University. The fiber uses a polyvinyl alcohol hydrogel, selected for its combination of optical properties and durability under repeated mechanical stress.
The investigation of soft materials was initiated with optogenetic pain research in the peripheral nervous system specifically in mind. “If your implant itself is causing pain, how are you going to use this technology to study pain?” said Rao, a neuroscientist now at the University of Massachusetts Amherst.
And although at the moment the hydrogel fiber primarily figures as a research tool in mice, the same qualities that set this new technology apart for basic science—durable and apparently comfortable in a freely moving body with no compromises in optical performance—are also positives for potential therapeutic purposes. “We are working towards that direction,” said Rao.
Researchers anchored one end of their fiber to the mice’s skulls, threaded it underneath the skin, and wrapped a cuff at the other end around the sciatic nerve in the leg. From a practical standpoint, this made the implant compatible with existing external light sources and kept the mice from scratching at any part of the device. It also demonstrated that the implant allowed the animals a full range of motion. At mouse scale, the fiber needed to be only 6 centimeters long, but the authors say it could be lengthened for other uses.
A series of tests showed how well the fiber transmitted light, as well as how it performed in the mice, where it both blocked pain caused by a hot plate on the foot and induced movement in the leg. Critically, it still performed well after several weeks of voluntary exercise-wheel use, which the researchers estimated added up to thousands of bends and twists.
Other optogenetic studies of the peripheral nervous system in mice have attempted various methods of light delivery that don’t use an optical fiber at all, instead shining light through the skin or implanting miniaturized remote devices. In comparison, the new hydrogel fiber should be able to more precisely target specific cells, said Rao.
For Bonin, the external light source has both a pro and a con: higher-intensity light, but also a tether that could affect behavior.
Federico Iseppon, a pain researcher at University College London who was not involved in the study, said that although the fiber may be relatively easy to use, it will still require specialized knowledge to fabricate and surgically implant. It promises a wide range of potential applications beyond just the brain and spine. “Its plasticity lies in the multiple different tissues that could be targeted with this technology,” he said.
Liu is currently working on an interface, such as a patch, between the hydrogel and organs that would enable connections that the current cuff design doesn’t allow. Ideally, the fiber will eventually also let scientists record activity as well as send signals to cells.