Monday, May 20, 2024

Virtual reality stories can spur environmental action

A man wears a virtual reality headset and looks upward.

Compared to traditional video, environmental stories told through metaverse technologies, including virtual reality and 360-degree video, can better motivate people to act on environmental threats, researchers report.

Seeing is believing. But according to the new research, observation may not be enough to activate people on environmental issues. Engagement is key.

As described in a paper in the journal Cyberpsychology, Behavior, and Social Networking, the metaverse provides not only a fantastical visual experience but also an interactive one that can make seemingly distant threats, like climate change or ocean acidification, feel close and personally relevant.

“The magic of VR isn’t just that it transports you somewhere, but it meaningfully uses interactivity to reduce psychological distance and increase immersion,” says Daniel Pimentel, an assistant professor in immersive media psychology at the University of Oregon’s School of Journalism and Communication.

“That is one of the biggest mechanisms in environmental storytelling that we’re not really focusing on but should be. It’s not merely enough to place people in digital environments; we have to ensure that when they’re there, they can engage with the story.”

Environmental communicators can use metaverse technologies to tell interactive stories that help change people’s attitudes and perceptions about environmental issues, says Pimentel, co-director of the Oregon Reality Lab in Portland.

In their latest research, Pimentel and his collaborator, Sriram Kalyanaraman of the University of Florida, conducted a series of studies testing whether storytelling through immersive media could shape people’s threat perceptions and engagement in pro-environmental activities.

They also surveyed people’s positive or negative attitudes toward the message and narrative.

“If you want to change minds and hearts, you need a story that people enjoy, right?” Pimentel says. “You want them to enjoy the storytelling experience, otherwise, it’s not going to resonate with them.”

In their first experiment, the researchers had study participants watch either an interactive, 360-degree video or a 2D equivalent on climate change in Alaska. As revealed in follow-up questionnaires, participants who watched the 360-degree video felt a greater sense of presence and closeness to the depicted threats than those who watched the 2D projection.

“I was probably the most surprised by this result,” Pimentel says, “because VR video alone, without even using a headset, already led to differences in how people conceptualized the information and thought of the threats more concretely and less abstractly.”

Because of VR’s immersive capabilities, the researchers wondered whether distant environmental threats could be made to feel closer to home. As in the first study, participants watched either a 360-degree video or a flat equivalent, this time of an underwater exploration highlighting coral bleaching.

Some participants watched a video that said the bleaching event took place locally in Florida—all participants were college students in Florida—whereas others were told it was set distantly in South Africa.

The participants’ perceptions of the local issue didn’t change much whether it was presented in a 360-degree or 2D format. Pimentel suggests it’s difficult to make an already local issue feel closer.

But for a distant threat, the added interactivity strengthened participants’ attitudes. “When it’s a distant story, 360-degree video really matters,” Pimentel says.

To investigate whether a VR headset could elevate the experience, the researchers had some participants watch the 360-degree ocean video through a headset instead. Participants with the headset had a more naturalistic way of controlling their point of view, whereas those without had to change their view by clicking and dragging with a mouse.

Being fully immersed in sound and visuals to a degree that a flat screen cannot achieve led these participants to report greater intentions to help the environment, Pimentel says.

“The more you engage with something, the more concrete and relevant it becomes,” he says. “Immersive media helps you perceive things as happening and occurring more presently. This research is a story of three studies that are saying the same thing: Interactivity increases cognitive absorption, which in turn leads to favorable evaluations of a message and how we see threats.”
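For readers who want to see what testing that kind of pathway looks like in practice, the sketch below runs a simple two-regression mediation check on simulated data. It only illustrates the interactivity-to-absorption-to-evaluation logic Pimentel describes, not the statistical models from the published studies; every variable name and effect size is hypothetical.

```python
# Hedged sketch: a minimal mediation check (interactivity -> cognitive absorption ->
# message evaluation) on simulated data. NOT the authors' analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300
interactivity = rng.integers(0, 2, n)                     # 0 = 2D video, 1 = interactive 360 video
absorption = 0.8 * interactivity + rng.normal(size=n)     # hypothetical mediator
evaluation = 0.6 * absorption + 0.1 * interactivity + rng.normal(size=n)

df = pd.DataFrame({"interactivity": interactivity,
                   "absorption": absorption,
                   "evaluation": evaluation})

# Path a: does interactivity predict cognitive absorption?
a = smf.ols("absorption ~ interactivity", data=df).fit().params["interactivity"]
# Path b: does absorption predict evaluation, controlling for interactivity?
model_b = smf.ols("evaluation ~ absorption + interactivity", data=df).fit()
b = model_b.params["absorption"]

print(f"indirect (mediated) effect a*b = {a * b:.2f}")
print(f"direct effect of interactivity = {model_b.params['interactivity']:.2f}")
```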

But not every environmental story needs to be in the metaverse, Pimentel cautions. Its use needs to be justified; in this case it was, because taking people on real scuba dives to learn about coral bleaching would be expensive, he says.

“It’s less about using immersive media technologies for the sake of using them and rather thinking holistically, meaningfully and deeply about why you’re using these platforms,” Pimentel says. “What we’re trying to do with our research is understand what levers we can pull as storytellers that lead to particular outcomes.”

Source: Leila Okahata for University of Oregon


System predicts who’s at risk of quitting opioid treatment

White pills spill from an orange pill bottle onto a reflective black surface.

Researchers have developed a system designed to identify patients at high risk of discontinuing buprenorphine treatment for opioid use disorder.

An FDA-approved prescription drug, buprenorphine is one of three commercially available treatments for opioid use disorder proven to be effective in treating both pain and addiction.

In a study published in the journal Computers in Biology and Medicine, Md Mahmudul Hasan and his research team found that roughly 15% of patients did not complete the clinically recommended yearlong buprenorphine treatment, while about 46% of patients stopped treatment within the first three months.

With the help of artificial intelligence (AI), the team also identified high-risk patients and several factors associated with treatment discontinuation.

Hasan, an assistant professor in the University of Florida College of Pharmacy department of pharmaceutical outcomes and policy, says the retrospective study, which included insured individuals aged 18 to 64 who were prescribed buprenorphine to treat opioid use disorder, offers new insights to use in the fight against the national public health epidemic that claimed more than 80,000 lives in the United States in 2021.

The study measured gaps of 30 days or more when buprenorphine prescriptions weren’t filled within the first year of treatment. By building predictive models focusing on distinct treatment stages—the time of treatment initiation, one month, and three months following the start of treatment—Hasan’s team found that nearly 15% of patients discontinued treatment prematurely. The team notes this is a conservative estimate, as several patient exclusion criteria might have resulted in a lower discontinuation rate.
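The paper's discontinuation measure, a gap of 30 days or more in filled prescriptions during the first year, is the kind of rule that can be checked directly against pharmacy-claims records. The following is a minimal sketch of that logic on made-up fill data; the study's actual code and data schema are not shown here, so the column names and example records are assumptions.

```python
# Hedged sketch: flag buprenorphine discontinuation as a gap of 30+ days between the
# end of one fill's supply and the next fill, within the first year of treatment.
# Illustrative only; the study's real pipeline and data schema are not public here.
import pandas as pd

fills = pd.DataFrame({
    "patient_id": [1, 1, 1, 2, 2],
    "fill_date": pd.to_datetime(
        ["2021-01-05", "2021-02-03", "2021-04-20", "2021-01-10", "2021-02-07"]),
    "days_supply": [28, 28, 28, 28, 28],
})

def discontinued(group, gap_days=30, horizon_days=365):
    group = group.sort_values("fill_date")
    supply_end = group["fill_date"] + pd.to_timedelta(group["days_supply"], unit="D")
    next_fill = group["fill_date"].shift(-1)
    gaps = (next_fill - supply_end).dt.days          # days between running out and refilling
    within_year = (group["fill_date"] - group["fill_date"].iloc[0]).dt.days <= horizon_days
    # A 30+ day gap, or no further fills carrying supply to the one-year mark, counts
    # as premature discontinuation.
    ran_out_early = (supply_end.iloc[-1] - group["fill_date"].iloc[0]).days < horizon_days
    return bool(((gaps >= gap_days) & within_year).any() or ran_out_early)

for pid, grp in fills.groupby("patient_id"):
    print(pid, discontinued(grp))
```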

“We know that sticking with a buprenorphine treatment plan is beneficial. Premature discontinuation could increase the risk of hospitalization, drug overdose, and most importantly, mortality,” Hasan says. “If we can use AI to predict which patients are at a higher risk of this behavior, clinical practitioners can get to the root cause, make more informed decisions and design more targeted interventions for those patients.”

The researchers used a framework for machine learning prediction and risk stratification to help identify high-risk patients and determine which factors contribute to a lack of buprenorphine treatment compliance.

Risk factors identified in this study include age, gender, early treatment adherence, use of stimulants or antipsychotics, and the number of days’ supply associated with the first buprenorphine prescription that a patient receives. The study also found that living in rural areas and other treatment access barriers contribute to a higher risk of discontinuation.
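As a rough illustration of how such a prediction-and-stratification framework can work, the sketch below trains a generic gradient-boosting classifier on synthetic patient features echoing the risk factors above, then buckets predicted probabilities into risk tiers. It is not the authors' model; the features, outcome, and thresholds are all illustrative assumptions.

```python
# Hedged sketch of the general approach (not the study's model): fit a classifier on
# baseline features, then stratify predicted probabilities into risk tiers.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000
X = pd.DataFrame({
    "age": rng.integers(18, 65, n),
    "female": rng.integers(0, 2, n),
    "first_rx_days_supply": rng.choice([7, 14, 28], n),
    "early_adherence_pct": rng.uniform(0, 1, n),   # proportion of days covered in month 1
    "stimulant_use": rng.integers(0, 2, n),
    "antipsychotic_use": rng.integers(0, 2, n),
    "rural": rng.integers(0, 2, n),
})
# Synthetic outcome loosely echoing the reported risk factors (younger age, stimulant
# use, low early adherence, rural residence raise risk); real labels would come from claims.
logit = (-0.04 * (X["age"] - 40) + 0.8 * X["stimulant_use"]
         - 2.0 * X["early_adherence_pct"] + 0.5 * X["rural"])
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)

risk = model.predict_proba(X_te)[:, 1]
tiers = pd.cut(risk, bins=[0, 0.33, 0.66, 1.0], labels=["low", "medium", "high"])
print(pd.Series(tiers).value_counts())
```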

“Younger patients are at a higher risk of prematurely stopping treatment, along with those with a history of stimulant use, including nicotine,” Hasan says. “We also found patients with lower buprenorphine adherence at the early treatment stage are more at risk of premature treatment discontinuation.”

When the technology developed in the study is available to medical centers across the country, it will save frontline clinicians precious time while giving patients more access to buprenorphine treatment, Hasan says.

“Primary care physicians are already overburdened and overworked, and they have limited resources. A tool like this that can reliably predict which patients will be at high risk could be helpful,” Hasan says.

“Within a short time and without increasing their workload, health care providers can identify the interventions needed for each patient, allowing them to best allocate their limited resources.”

Source: University of Florida


Friday, May 17, 2024

Jelly sea creature ‘jet propulsion’ could give robots a boost

Transparent salps band together to swim through deep blue water.

Scientists have discovered that colonies of gelatinous sea animals swim through the ocean in giant corkscrew shapes using coordinated jet propulsion.

It’s an unusual kind of locomotion that could inspire new designs for efficient underwater vehicles.

Salps are small, jellyfish-like creatures that make a nightly journey from the depths of the ocean to the surface. Observing that migration with specialized cameras helped researchers capture the macroplankton’s graceful, coordinated swimming behavior.


“The largest migration on the planet happens every single night: the vertical migration of planktonic organisms from the deep sea to the surface,” says Kelly Sutherland, an associate professor in biology at the University of Oregon’s Oregon Institute of Marine Biology, who led the research.

“They’re running a marathon every day using novel fluid mechanics. These organisms can be platforms for inspiration on how to build robots that efficiently traverse the deep sea.”

Despite looking similar to jellyfish, salps are barrel-shaped, watery macroplankton that are more closely related to vertebrates like fish and humans, says Alejandro Damian-Serrano, an adjunct professor in biology. They live far from shore, either as solitary individuals or in colonies, he says. Colonies consist of hundreds of individuals linked in chains that can be up to several meters long.

“Salps are really weird animals,” Damian-Serrano says. “While their common ancestor with us probably looked like a little boneless fish, their lineage lost a lot of those features and magnified others. The solitary individuals behave like this mothership that asexually breeds a chain of individual clones, conjoined together to produce a colony.”

But the salps’ most distinctive trait emerged during the researchers’ ocean expeditions: their swimming technique.

Exploring off the coast of Kailua-Kona, Hawaii, Sutherland and her team developed specialized 3D camera systems to bring their lab underwater. They conducted daytime scuba dives, “immersed in infinite blue,” as Damian-Serrano described it, for high-visibility investigations.

They also performed nighttime dives, when the black backdrop allowed for high-contrast imaging of the transparent critters. They encountered an immense flurry of different salps that were doing their nightly migration to the surface—and many photobombing sharks, squids, and crustaceans, Sutherland says.

Through imaging and recordings, the researchers noticed two modes of swimming: shorter colonies spun around an axis, like a spiraling football, while longer chains buckled and coiled like a corkscrew. That motion is called helical swimming.

Helical swimming is nothing new in biology, Sutherland says. Many microorganisms also spin and corkscrew through water, but the mechanisms behind the salps’ motion are different.

Microbes beat the water with hair-like projections or whipping tails, but salps swim via jet propulsion, Sutherland says. They have contracting muscle bands, like those in the human throat, that suck water in at one end of the body and squirt it out the other to create thrust, Damian-Serrano says.

The researchers also noticed that individual jets contracted at different times, allowing the whole colony to travel steadily without pause. The jets were also angled, contributing to the spinning and corkscrewing, Sutherland says.
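A small back-of-the-envelope model helps show why staggered pulsing can produce that smooth, pause-free travel. The sketch below treats each colony member's jet as a pulsed thrust and compares a synchronized colony with one whose pulses are evenly phase-offset; it is a toy illustration, not a simulation from the paper, and every number in it is made up.

```python
# Hedged toy model: staggering pulsed jets across a chain smooths total thrust.
# Each zooid's jet is a half-rectified sine pulse; all parameters are hypothetical.
import numpy as np

t = np.linspace(0, 4, 4000)                 # seconds
n_zooids, freq = 10, 1.0                    # jets per colony, pulses per second

def pulse(t, phase):
    return np.maximum(np.sin(2 * np.pi * freq * t + phase), 0.0)

synchronized = n_zooids * pulse(t, 0.0)
staggered = sum(pulse(t, 2 * np.pi * k / n_zooids) for k in range(n_zooids))

for name, thrust in [("synchronized", synchronized), ("staggered", staggered)]:
    cv = thrust.std() / thrust.mean()       # relative fluctuation of total thrust
    print(f"{name:>12}: mean thrust {thrust.mean():.2f}, fluctuation (CV) {cv:.2f}")
```

In this toy setup the staggered colony produces nearly constant total thrust, while the synchronized one surges and stalls, which is consistent with the smooth motion the researchers describe, though the real fluid dynamics are far richer.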

“My initial reaction was really one of wonder and awe,” she says. “I would describe their motion as snake-like and graceful. They have multiple units pulsing at different times, creating a whole chain that moves very smoothly. It’s a really beautiful way of moving.”

Microrobots inspired by microbial swimmers already exist, Sutherland says, but this discovery paves the way for engineers to construct larger underwater vehicles. It may be possible to create robots that are silent and less turbulent when modeled after these efficient swimmers, Damian-Serrano says. A multijet design also may be energetically advantageous for saving fuel, he says.

Beyond microbes, larger planktonic organisms had yet to be described swimming this way, Sutherland says. With these new methods of studying sea creatures, scientists may come to realize that helical swimming is more pervasive than previously thought.

“It’s a study that opens up more questions than provides answers,” Sutherland says. “There’s this new way of swimming that hadn’t been described before, and when we started the study we sought to explain how it works.

“But we found that there are a lot more open questions, like what are the advantages of swimming this way? How many different organisms spin or corkscrew?”

The study is published in Science Advances. Additional coauthors are from Louisiana Universities Marine Consortium, University of South Florida, Roger Williams University, Marine Biological Laboratory, and Providence College.

The Gordon and Betty Moore Foundation and the Office of Naval Research supported the work.

Source: Leila Okahata for University of Oregon


Wednesday, May 15, 2024

AI detects sex-related differences in brain structure

A group of men and women walk over a brain painted on concrete.

Artificial intelligence computer programs that process MRI results show differences in how the brains of men and women are organized at a cellular level, a new study shows.

The variations were spotted in white matter, tissue primarily located in the human brain’s innermost layer, which fosters communication between regions.

Men and women are known to experience multiple sclerosis, autism spectrum disorder, migraines, and other brain issues at different rates and with varying symptoms.

A detailed understanding of how biological sex impacts the brain is therefore viewed as a way to improve diagnostic tools and treatments. However, while brain size, shape, and weight have been explored, researchers have only a partial picture of the brain’s layout at the cellular level.

Led by researchers at NYU Langone Health, the new study used an artificial intelligence (AI) technique called machine learning to analyze thousands of MRI brain scans from 471 men and 560 women. Results revealed that the computer programs could accurately distinguish between biological male and female brains by spotting patterns in structure and complexity that were invisible to the human eye.

The findings were validated by three different AI models designed to identify biological sex using their relative strengths in either zeroing in on small portions of white matter or analyzing relationships across larger regions of the brain.

“Our findings provide a clearer picture of how a living, human brain is structured, which may in turn offer new insight into how many psychiatric and neurological disorders develop and why they can present differently in men and women,” says neuroradiologist Yvonne W. Lui, a professor and vice chair for research in the radiology department at NYU Grossman School of Medicine and senior author of the study in Scientific Reports.

Lui notes that previous studies of brain microstructure have largely relied on animal models and human tissue samples. In addition, the validity of some of these past findings has been called into question for relying on statistical analyses of “hand-drawn” regions of interest, meaning researchers needed to make many subjective decisions about the shape, size, and location of the regions they chose. Such choices can potentially skew the results, Lui says.

The new study results avoided that problem by using machine learning to analyze entire groups of images without asking the computer to inspect any specific spot, which helped to remove human biases, the authors say.

For the research, the team started by feeding the AI programs existing brain scans from healthy men and women, along with the biological sex of each scan. Since these models were designed to use complex statistical and mathematical methods to get “smarter” over time as they accumulated more data, they eventually “learned” to distinguish biological sex on their own. Importantly, the programs were restricted from using overall brain size and shape to make their determinations, says Lui.

According to the results, all of the models correctly identified the sex of subject scans between 92% and 98% of the time. Several features in particular helped the machines make their determinations, including how easily and in what direction water could move through brain tissue.
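For context on what classifying sex from diffusion features can look like, here is a minimal sketch that trains a classifier on simulated region-wise fractional-anisotropy values and reports cross-validated accuracy. It is not the NYU team's models or data; the feature set, simulated effect, and classifier are stand-ins, and only the 471/560 group sizes come from the article.

```python
# Hedged sketch: a generic classifier on simulated diffusion-MRI features
# (fractional anisotropy per white-matter region). NOT the study's architecture.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n_subjects, n_regions = 1031, 48        # 471 men + 560 women; 48 hypothetical tracts
sex = np.concatenate([np.zeros(471), np.ones(560)]).astype(int)

# Simulated fractional-anisotropy values with a small sex-related shift in some regions.
fa = rng.normal(0.45, 0.05, size=(n_subjects, n_regions))
fa[sex == 1, :10] += 0.02

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, fa, sex, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```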

“These results highlight the importance of diversity when studying diseases that arise in the human brain,” says co-lead author Junbo Chen, a doctoral candidate at NYU Tandon School of Engineering.

“If, as has been historically the case, men are used as a standard model for various disorders, researchers may miss out on critical insight,” adds co-lead author Vara Lakshmi Bayanagari, a graduate research assistant at NYU Tandon School of Engineering.

Bayanagari cautions that while the AI tools could report differences in brain-cell organization, they could not reveal which sex was more likely to have which features. She adds that the study classified sex based on genetic information and included MRIs only from cisgender men and women.

According to the authors, the team next plans to explore the development of sex-related brain structure differences over time to better understand environmental, hormonal, and social factors that could play a role in these changes.

The National Institutes of Health and the United States Department of Defense supported the work.

Source: NYU


Sleep apnea during REM contributes to verbal memory decline

A man sleeps in bed while wearing a sleep apnea mask.

New research reveals a link between the frequency of sleep apnea events during the rapid-eye-movement stage and the severity of verbal memory impairment in older adults at risk for Alzheimer’s disease.

Verbal memory refers to the cognitive ability to retain and recall information presented through spoken words or written text and is particularly vulnerable to Alzheimer’s.

As reported in the journal Alzheimer’s Research & Therapy, researchers discovered a specific correlation between the severity of sleep apnea—when breathing repeatedly pauses during sleep—and diminished cognition: a higher ratio of apnea events during REM sleep, relative to non-REM stages, was associated with worse memory performance.

“Our findings identified the specific features of sleep apnea that are associated with memory, which is important because clinically, events occurring during REM sleep are often overlooked or minimized,” says co-corresponding author Bryce Mander, associate professor of psychiatry and human behavior at the University of California, Irvine.

“Most hours of sleep are non-REM, so the overall averages of apnea severity can look much lower than what is typically observed during REM sleep. This means that someone at risk can be misdiagnosed and undertreated because current evaluation standards are not focused on sleep-stage-specific apnea severity.”
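The arithmetic behind that point is simple: an overall apnea-hypopnea index (AHI) is a time-weighted average across stages, and because non-REM sleep dominates the night, severe REM-stage apnea can be diluted into a seemingly mild overall score. The numbers in the sketch below are hypothetical.

```python
# Hedged arithmetic sketch: a whole-night AHI can hide severe REM-specific apnea
# because most sleep time is non-REM. All values are made up for illustration.
rem_hours, nrem_hours = 1.5, 6.0          # a typical night is mostly non-REM
rem_ahi, nrem_ahi = 28.0, 4.0             # events per hour in each stage (hypothetical)

overall_ahi = (rem_ahi * rem_hours + nrem_ahi * nrem_hours) / (rem_hours + nrem_hours)
print(f"overall AHI = {overall_ahi:.1f} events/hour")   # ~8.8, 'mild' despite severe REM apnea
print(f"REM AHI     = {rem_ahi:.1f} events/hour")
```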

“Furthermore, we found that women are more likely to have a greater proportion of their apneic events in REM sleep in comparison to men, which could potentially be contributing to their greater risk for Alzheimer’s disease,” says co-corresponding author Ruth Benca, professor and chair of psychiatry and behavioral medicine at Wake Forest University School of Medicine.

The study involved 81 middle-aged and older adults from the Wisconsin Alzheimer’s Disease Research Center with heightened risk factors, of whom 62% were female. Participants underwent polysomnography—a comprehensive test that records brain waves, eye movements, muscle activity, blood oxygen levels, heart rate, and breathing during sleep—and verbal memory assessments.

Results showed apnea events during REM to be a critical factor contributing to verbal memory decline, especially among individuals with a genetic predisposition to Alzheimer’s and those with a parental history of the disease.

“Our findings highlight the intricate relationship among sleep apnea, memory function, and Alzheimer’s risk,” Mander says. “Identifying and addressing REM-specific events are crucial for developing proactive, personalized approaches to assessment and treatment that are tailored to individual sleep patterns.”

Kitty K. Lui, a graduate student in the San Diego State University/University of California, San Diego joint doctoral program in clinical psychology, is the study’s lead author. Additional coauthors are from UC Irvine, UC San Diego, the Wisconsin Alzheimer’s Disease Research Center, and the University of Kentucky.

The National Institute on Aging, the National Institutes of Health, and the National Center for Advancing Translational Sciences’ Clinical and Translational Science Awards Program funded the work.

Source: UC Irvine


COVID virus can infect your eyes and damage vision

A group of plastic eyeballs sit on a gray surface.

The virus that causes COVID-19 can breach the protective blood-retinal barrier, leading to potential long-term consequences in the eye, new research shows.

The blood-retinal barrier is designed to protect our vision from infections by preventing microbial pathogens from reaching the retina where they could trigger an inflammatory response with potential vision loss.

Pawan Kumar Singh, an assistant professor of ophthalmology at the University of Missouri, leads a team researching new ways to prevent and treat ocular infectious diseases.

Using a humanized ACE2 mouse model, the team found that SARS-CoV-2, the virus that causes COVID-19, can infect the inside of the eyes even when the virus doesn’t enter the body through the surface of the eyes.

Instead, they found that when the virus enters the body through inhalation, it not only infects organs like the lungs but also reaches highly protected organs like the eyes, crossing the blood-retinal barrier by infecting the cells that line it.

“This finding is important as we increase our understanding of the long-term effects of SARS-CoV-2 infection,” says Singh. “Earlier, researchers were primarily focused on the ocular surface exposure of the virus.

“However, our findings reveal that SARS-CoV-2 not only reaches the eye during systemic infection but induces a hyperinflammatory response in the retina and causes cell death in the blood-retinal barrier. The longer viral remnants remain in the eye, the greater the risk of damage to the retina and visual function.”

Singh also discovered that the extended presence of SARS-CoV-2 spike antigen can cause retinal microaneurysms, retinal artery and vein occlusion, and vascular leakage.

“For those who have been diagnosed with COVID-19, we recommend you ask your ophthalmologist to check for signs of pathological changes to the retina,” Singh says. “Even those who were asymptomatic could suffer from damage in the eyes over time because of COVID-19 associated complications.”

While viruses and bacteria have been found to breach the blood-retinal barrier in immunocompromised people, this research is the first to suggest that the virus that causes COVID-19 could breach the barrier even in otherwise healthy individuals, leading to an infection that manifests inside the eye itself.

Immunocompromised patients or those with hypertension or diabetes may experience worse outcomes if they remain undiagnosed for COVID-19 associated ocular symptoms.

“Now that we know the risk of COVID-19 to the retina, our goal is to better understand the cellular and molecular mechanisms of how this virus breaches the blood-retinal barrier and associated pathological consequences in hopes of informing development of therapies to prevent and treat COVID-19 induced eye complications before a patient’s vision is compromised,” Singh says.

The study appears in the journal PLOS Pathogens.

The National Institutes of Health/National Eye Institute and the University of Missouri funded the work.

Source: University of Missouri


Tuesday, May 14, 2024

Meat prices will likely go up this summer

Burgers and hotdogs cook on a grill as the cook removes one burger with cheese on it with a spatula.

A new report provides insights on the future price of beef, pork, and chicken.

The Texas A&M University Food Price Predictor study integrates historical data, current market trends, and predictive models to offer a detailed projection of future retail meat prices.

Released in time for grilling season, the report helps consumers plan effectively for summer barbecues.

The team analyzed market data using statistical models to predict price changes for various meat products. The study focused on ground beef, chuck roast Choice, steak sirloin Choice, all pork chops, and boneless chicken breast—all typical cuts consumed during the traditional grilling season. While a chuck roast is not considered a typical grilling cut, its price provides some insight into other items, such as ground beef and certain steak cuts like flat iron.

The study projects modest increases in most meat prices this summer, with beef and pork prices edging up and chicken breast prices declining.

Price changes that occurred from September through February included:

  • Ground beef: up 0.45%
  • Chuck roast: up 4.68%
  • Sirloin steak: up 0.77%
  • Pork chops: down 3.93%
  • Boneless chicken breast: down 2.93%

According to the report, meat price changes expected from May to October are:

  • Ground beef, 100% beef: up 0.1% to 0.7% or $5.13-$5.19 per pound.
  • Chuck roast Choice: up 0.7% to 1.3% or $7.21-$7.34 per pound.
  • Steak sirloin Choice: up 0.1% to 0.6% or $11.72-$11.78 per pound.
  • Pork chops: up 0.1% to 1% or $4.24-$4.28 per pound.
  • Boneless chicken breast: down 2.93% or $4.06-$3.91 per pound.

Price expectations are averaged across the US. The report notes retail prices are heavily influenced by retail location, price discounting, and other market variables.
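As a quick worked example of how a projected percentage range maps onto a per-pound price, the sketch below applies a percent-change range to an assumed current shelf price. The base price used here is an assumption for illustration; the report's published ranges come from its own models.

```python
# Hedged sketch: converting a projected percentage range into a dollar range,
# given an assumed current per-pound price (the base price is an assumption).
def dollar_range(current_price, pct_low, pct_high):
    """Return the (low, high) projected price after applying a percent-change range."""
    return current_price * (1 + pct_low / 100), current_price * (1 + pct_high / 100)

low, high = dollar_range(11.71, 0.1, 0.6)     # e.g., Choice sirloin steak, up 0.1% to 0.6%
print(f"Sirloin steak: ${low:.2f}-${high:.2f} per pound")   # about $11.72-$11.78
```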

Some of the key observations for the report are:

  • The anticipated slight increase in beef prices, particularly for ground beef and chuck roast, can be attributed to the seasonal surge in demand and reduced beef production.
  • The trend in lower chicken prices is largely due to efficiencies gained in poultry production, increasing production, and lower feed costs.
  • The slight uptick in sirloin steak prices is a response to a shift in consumer preferences toward higher quality cuts, fueled by an improving economy and reduced beef supplies.
  • The modest increase in pork chop prices aligns with the expected seasonal increase in demand combined with slightly constrained supply levels.
  • Greater profitability and lower feed costs should keep chicken supplies plentiful.

“These report estimates will provide US consumers with a better understanding of the factors that influence meat prices and help them estimate future meat costs, allowing households to budget more effectively,” says lead author Simon Somogyi, director of the Weston Agrifood Sales Program and endowed chair in the agricultural economics department.

Source: Texas A&M University
