Thursday, August 28, 2025

Team figures out how cavefish lost their eyes

An eyeless cavefish in profile against a black background.

In a new study, researchers show when cavefishes lost their eyes, a finding that also provides a way to date the cave systems they inhabit.

Small, colorless, and blind, amblyopsid cavefishes inhabit subterranean waters throughout the eastern United States.

In an analysis of the genomes of all known amblyopsid species, the researchers found that the different species colonized cave systems independently of each other and separately evolved similar traits—such as the loss of eyes and pigment—as they adapted to their dark cave environments.

Their findings appear in the journal Molecular Biology and Evolution.

By studying the genetic mutations that caused the fishes’ eyes to degenerate, the researchers developed a sort of mutational clock that allowed them to estimate when each species began losing its eyes. They found that the vision-related genes of the oldest cavefish species, the Ozark cavefish (Troglichthys rosae), began degenerating up to 11 million years ago.

The technique provides a minimum age for the caves that the fishes colonized, since the fish must already have been living in subterranean waters when their eyesight began to degenerate, the researchers say.

“The ancient subterranean ecosystems of eastern North America are very challenging to date using traditional geochronological cave-dating techniques, which are unreliable beyond an upper limit of about 3 to 5 million years,” says Chase Brownstein, a student in Yale’s Graduate School of Arts and Sciences, in the ecology and evolutionary biology department, and the study’s co-lead author.

“Determining the ages of cave-adapted fish lineages allows us to infer the minimum age of the caves they inhabit because the fishes wouldn’t have started losing their eyes while living in broad daylight. In this case we estimate a minimum age of some caves of over 11 million years.”

Maxime Policarpo of the Max Planck Institute for Biological Intelligence and the University of Basel is the co-lead author.

For the study, the researchers reconstructed a time-calibrated evolutionary tree for amblyopsids, which belong to an ancient, species-poor order of freshwater fishes called Percopsiformes, using the fossil record as well as genomic data and high-resolution scans of all relevant living species.

All the cavefish species have similar anatomies, including elongated bodies and flattened skulls, and their pelvic fins are either lost or severely reduced. The swampfish (Chologaster cornuta), a sister lineage to the cavefishes that inhabits murky surface waters, also has a flattened skull, an elongated body, and no pelvic fins. While it retains sight and pigment, the bones around its eyes are softened; in cavefishes, those bones disappear entirely. This suggests that cavefishes evolved from a common ancestor that was already equipped to inhabit low-light environments, Brownstein says.

To understand when the cavefish began populating caves—something impossible to discern from the branches of an evolutionary tree—the researchers studied the fishes’ genomes, examining 88 vision-related genes for mutations. The analysis revealed that the various cavefish lineages had completely different sets of genetic mutations involved in the loss of vision. This, they say, suggests that separate species colonized caves and adapted to those subterranean ecosystems independently of each other.

From there, the researchers developed a method for calculating the number of generations that have passed since cavefish species began adapting to life in caves by losing the functional copies of vision-related genes.
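
As a rough illustration of how a mutational clock of this kind can work, the sketch below inverts an expected loss-of-function mutation count into a generation estimate. It is a generic, hedged sketch: the mutation rate, the number of loss-of-function sites, the observed count, and the generation time are all made-up parameters, not values or the exact model from the study.

```python
# Generic sketch of a loss-of-function "mutational clock" (illustrative only;
# the rates, site counts, and single-gene framing are assumptions, not the
# study's actual parameters or model).

def expected_lof_mutations(generations, mutation_rate_per_site, lof_sites):
    """Expected loss-of-function mutations once selection on vision is relaxed."""
    return generations * mutation_rate_per_site * lof_sites

def estimate_generations(observed_lof, mutation_rate_per_site, lof_sites):
    """Invert the clock: generations implied by an observed mutation count."""
    return observed_lof / (mutation_rate_per_site * lof_sites)

# Hypothetical numbers purely for illustration.
mu = 1e-8          # mutations per site per generation (assumed)
sites = 50_000     # potential loss-of-function sites across vision genes (assumed)
observed = 40      # loss-of-function mutations observed in those genes (assumed)

generations = estimate_generations(observed, mu, sites)
years = generations * 2.0   # assumed generation time of roughly 2 years
print(f"~{generations:,.0f} generations, ~{years/1e6:.1f} million years")
```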

Their analysis suggests that cave adaptations arose between 2.25 and 11.3 million years ago in the Ozark cavefish and, for the other cavefish lineages, between 342,000 and 1.7 million years ago at minimum and between 1.7 and 8.7 million years ago at maximum. The findings support the conclusion that at least four amblyopsid lineages independently colonized caves after evolving from surface-dwelling ancestors, the researchers say.

The maximum ages exceed the ranges of traditional cave-dating methods, which include isotope analysis of cosmogenic nuclides produced within rocks and soils by cosmic rays, the researchers note.

The findings also suggest potential implications for human health, says Thomas Near, professor of ecology and evolutionary biology at Yale, and senior author of the study.

“A number of the mutations we see in the cavefish genomes that lead to degeneration of the eyes are similar to mutations that cause ocular diseases in humans,” says Near, who is also the Bingham Oceanographic Curator of Ichthyology at the Yale Peabody Museum.

“There is the possibility for translational medicine through which by studying this natural system in cavefishes, we can glean insights into the genomic mechanisms of eye diseases in humans.”

Additional coauthors are from the South Carolina Department of Natural Resources, the American Museum of Natural History, Florida State University, and Paris-Cité University.

Source: Yale


Wednesday, August 27, 2025

New approach may relieve arthritic knees without drugs or surgery

A man clutches his knee in pain while sitting outside.

A new study reveals a potential way to relieve arthritic knees without drugs or surgery.

Nearly a quarter of people over the age of 40 experience painful osteoarthritis, making it a leading cause of disability in adults. Osteoarthritis degrades joint-cushioning cartilage, and there is currently no way of reversing this damage: the only options are managing pain with medication and, eventually, joint replacement.

Researchers are now demonstrating the potential for another option: gait retraining.

By making a small adjustment to the angle of their foot while walking, participants in a yearlong randomized controlled trial experienced pain relief equivalent to medication. Critically, those participants also showed less knee cartilage degradation over that period compared with a group that received a placebo treatment.

Published in The Lancet Rheumatology and co-led by Scott Uhlrich of the University of Utah’s John and Marcia Price College of Engineering, these findings come from the first placebo-controlled study to demonstrate the effectiveness of a biomechanical intervention for osteoarthritis.

“We’ve known that for people with osteoarthritis, higher loads in their knee accelerate progression, and that changing the foot angle can reduce knee load,” says Uhlrich, an assistant professor of mechanical engineering.

“So the idea of a biomechanical intervention is not new, but there have not been randomized, placebo-controlled studies to show that they’re effective.”

With support from the National Institutes of Health and other federal agencies, the researchers were specifically looking at patients with mild-to-moderate osteoarthritis in the medial compartment of the knee—on the inside of the leg—which tends to bear more weight than the lateral, outside, compartment. This form of osteoarthritis is the most common, but the ideal foot angle for reducing load in the medial side of the knee differs from person to person, depending on their natural gait and how it changes when they adopt the new walking pattern.

“Previous trials prescribed the same intervention to all individuals, resulting in some individuals not reducing, or even increasing, their joint loading,” Uhlrich says.

“We used a personalized approach to selecting each individual’s new walking pattern, which improved how much individuals could offload their knee and likely contributed to the positive effect on pain and cartilage that we saw.”

In their first two visits, participants received a baseline MRI and practiced walking on a pressure-sensitive treadmill while motion-capture cameras recorded the mechanics of their gait. This allowed the researchers to determine whether turning the patient’s toe inward or outward would reduce load more, and whether a 5-degree or 10-degree adjustment would be ideal.
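
A minimal sketch of that personalization logic appears below: for one participant, compare a knee-load measurement for each candidate foot-angle change against the natural-gait baseline and prescribe the angle with the largest reduction. The numbers and the single-number load summary are illustrative assumptions, not the study's biomechanical pipeline.

```python
# Minimal sketch: choose the foot-angle change that most reduces a (hypothetical)
# per-condition knee-load measurement relative to the participant's natural gait.

baseline_load = 3.1  # e.g., peak knee load in the natural gait (assumed units)

# Hypothetical measurements for one participant, one per candidate condition.
candidate_loads = {
    "toe-in 5 deg": 2.9,
    "toe-in 10 deg": 2.7,
    "toe-out 5 deg": 3.0,
    "toe-out 10 deg": 3.2,
}

reductions = {angle: baseline_load - load for angle, load in candidate_loads.items()}
best_angle, best_reduction = max(reductions.items(), key=lambda kv: kv[1])

if best_reduction <= 0:
    # Mirrors the screening step described below: no candidate lowers load,
    # so this participant would not be expected to benefit.
    print("No foot-angle change reduces knee load; screen out.")
else:
    print(f"Prescribe {best_angle} (reduces load by {best_reduction:.1f})")
```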

This personalized analysis also screened out potential participants who would not benefit from the intervention because none of the candidate foot-angle changes reduced loading in their knees. Such participants were included in previous studies, which may have contributed to those studies’ inconclusive pain results.

Moreover, after their initial intake sessions, half of the 68 participants were assigned to a sham treatment group to control for the placebo effect. These participants were prescribed foot angles that were actually identical to their natural gait. Conversely, participants in the intervention group were prescribed the change in foot angle that maximally reduced their knee loading.

Participants from both groups returned to the lab for six weekly training sessions, where they received biofeedback—vibrations from a device worn on the shin—that helped them maintain the prescribed foot angle while walking on the lab’s treadmill. After the six-week training period, participants were encouraged to practice their new gait for at least 20 minutes a day, to the point where it became natural. Periodic check-in visits showed that participants were adhering to their prescribed foot angle within a degree on average.

After a year, all participants self-reported their experience of knee pain and had a second MRI to quantitatively assess the damage to their knee cartilage.

“The reported decrease in pain over the placebo group was somewhere between what you’d expect from an over-the-counter medication, like ibuprofen, and a narcotic, like OxyContin,” Uhlrich says.

“With the MRIs, we also saw slower degradation of a marker of cartilage health in the intervention group, which was quite exciting.”

Beyond the quantitative measures of effectiveness, participants in the study expressed enthusiasm for both the approach and the results. One participant said, “I don’t have to take a drug or wear a device… it’s just a part of my body now that will be with me for the rest of my days, so that I’m thrilled with.”

Participants’ ability to adhere to the intervention over long periods of time is one of its potential advantages.

“Especially for people in their 30s, 40s, or 50s, osteoarthritis could mean decades of pain management before they’re recommended for a joint replacement,” Uhlrich says. “This intervention could help fill that large treatment gap.”

Before this intervention can be clinically deployed, the gait retraining process will need to be streamlined. The motion-capture technique used to make the original foot-angle prescription is expensive and time-consuming; the researchers envision the intervention eventually being prescribed in a physical therapy clinic, with retraining happening while people walk around their neighborhood.

“We and others have developed technology that could be used to both personalize and deliver this intervention in a clinical setting using mobile sensors, like smartphone video and a ‘smart shoe,'” Uhlrich says. Future studies of this approach are needed before the intervention can be made widely available to the public.

Additional researchers from the University of Utah, New York University, and Stanford University contributed to the work.

Support for the research came from federal research grants from the Department of Veterans Affairs, National Institutes of Health, and National Science Foundation.

Source: University of Utah


The location of trees affects home values

A dollar bill folded into the shape of a small house in front of a white background.

In densely populated urban areas, trees may be the leafy secret to an increase in property values.

Through a novel meta-analysis, Pamplin College of Business researchers used multiple data sets to confirm that trees add value to urban properties, but how much value depends on where those trees are.

Homes tend to be worth more when trees are nearby to provide shade and improve the look of the neighborhood. But on-site trees are not always desirable because they come with downsides such as maintenance or safety concerns, the study found.

“While tree cover supports owners’ property values, their tree cover benefits the greater community’s property values,” says Kevin Boyle, a professor of real estate at Virginia Tech.

The study, done in partnership with the US Forest Service, analyzed geographic information system data from the US Environmental Protection Agency for nine metropolitan areas to assess the value of tree coverage near properties. It grouped trees into three distance bands: less than 330 feet from a property, 330 feet to one-third of a mile, and one-third to two-thirds of a mile.

The data show that the effect of tree coverage on property values varies by city. For comparison, Fresno, California, and Milwaukee, Wisconsin, were used to represent the smallest and largest effects of tree cover on property values.

In Fresno, the existing tree cover on or near a property only adds about $9 to an average-priced property. In Milwaukee, that same tree cover adds roughly $3,500.

Increase the number of trees on or near a property by 10 percent, and that property in Fresno would decline in value by $114, or 0.1%. Put those trees between one-third and two-thirds of a mile away, and the home’s value increases by $6,751, or 3.2%.

By contrast, the Milwaukee home’s value would increase by $39,701, or 24%, with more tree coverage within 330 feet. Distant tree cover, between one-third and two-thirds of a mile, adds another $27,194 to the home’s value, a 16% increase.
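
To connect the percent and dollar figures, the dollar effect of a given percent change simply scales with a home’s baseline value; the short sketch below uses an assumed baseline price for illustration rather than the study’s actual city-level averages.

```python
# Convert a percent change in value into dollars for an assumed baseline price.
# The baseline price here is a placeholder, not a figure from the study.

def dollar_change(baseline_price, percent_change):
    """Dollar effect implied by a percent change on a given baseline value."""
    return baseline_price * percent_change / 100.0

assumed_price = 170_000  # hypothetical average home value
for pct in (0.1, 3.2, 16.0, 24.0):
    print(f"{pct:>5.1f}% of ${assumed_price:,} = ${dollar_change(assumed_price, pct):,.0f}")
```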

The study in Ecological Economics explains that community members support trees when they can benefit from the shade when walking or aesthetics when driving. It also notes the complexities of community trees because most urban property is privately owned. That means community leaders may have to entice property owners to retain and plant trees for the public good, even if it means taking on maintenance responsibilities.

The findings also consider the good and bad environmental impacts of trees in weather events such as high temperatures or severe storms. During storms, trees can damage homes and property. But trees also reduce urban heat islands, absorb storm runoff, and more, bringing benefits to both property owners and community members.

“Our analysis of multiple datasets and the associated findings can be used by community leaders to understand the benefits of tree coverage,” Boyle says.

“It can also educate community members that their actions in support of tree coverage can help not only the greater community, but themselves as well through enhanced property values. It is important to strategically maintain tree cover while recognizing the concerns of property owners and community members.”

Source: Virginia Tech


Northwest US will face higher lightning and wildfire risk

A forest burns as smoke turns the sky purple.

The Northwest can expect a widespread increase in days with cloud-to-ground lightning in the years to come—along with heightened wildfire risk—according to projections made with a unique machine-learning approach.

The new study in the journal Earth’s Future offers detailed projections of lightning across the Western US for the mid-21st century. The largest change in lightning is expected in parts of Idaho, Washington, and Oregon, with four to 12 more days of lightning per year in some parts of the Rocky Mountains.

Researchers also matched those lightning projections with future wildfire risk to calculate the changes in daily risk of lightning-caused fires. Although there is variability across the region in the projections, the trend was clear: a heightened risk of lightning-caused wildfires across 98% of Western lands susceptible to fire.

“The Northwest is emerging, in this study as well as in others, as the region where fire- and fire-related hazards are likely to increase substantially more than in other parts of the western US,” says Deepti Singh, an associate professor in the School of the Environment at Washington State University Vancouver and coauthor of the paper.

The study adds urgency to the need to manage forests for wildfire risk and prepare at-risk communities for fires, as the planet continues to warm and wildfires grow in size and severity, the researchers say. Lightning already accounts for more than two-thirds of the acreage burned in wildfires across the West, but current global climate models are unable to directly simulate future lightning because they rely on geographic resolutions too coarse to capture the conditions that create it.

The machine-learning models developed in this study zoom in to create the most detailed picture yet of future lightning patterns and lightning-caused fire risk across the West.

“There are already a lot of studies that say future wildfire activity will increase in the Western US and that’s without even considering potential lightning increasing, which we’re showing is going to happen in many areas,” says Dmitri Kalashnikov, lead author of the paper.

“We’re also making projections for the near-term future—2031 to 2060. That period starts in just a few years, so it’s on our doorstep.”

To make these projections, Kalashnikov applied a machine learning technique known as a convolutional neural network. These neural network-based predictive models were tailor-made for each grid cell of 1 degree by 1 degree across the Western US. That’s an area of roughly 69 miles on each side, which is the typical spatial resolution of climate models. This approach allowed for targeted lightning projections at finer geographic scales than previous studies.
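
As a rough sketch of what a per-grid-cell convolutional model can look like, the code below defines a small binary classifier for “lightning day or not.” The framework (PyTorch), the three input channels, the 8-by-8 patch of surrounding conditions, and the layer sizes are all assumptions for illustration, not the authors’ published architecture.

```python
# Minimal per-grid-cell CNN sketch (PyTorch). Channel count, patch size, and
# layer sizes are illustrative assumptions, not the study's architecture.
import torch
from torch import nn

class LightningDayCNN(nn.Module):
    def __init__(self, n_channels: int = 3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(n_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(32, 1),   # logit for "cloud-to-ground lightning day"
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

# One model of this kind would be trained per 1-degree grid cell, on that
# cell's own historical meteorology and lightning observations.
model = LightningDayCNN()
fake_batch = torch.randn(4, 3, 8, 8)        # 4 days x 3 variables x 8x8 patch
lightning_prob = torch.sigmoid(model(fake_batch))
print(lightning_prob.shape)                 # torch.Size([4, 1])
```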

These neural-network-based predictive models were the subject of a paper published in 2024, led by Kalashnikov and coauthored by Singh, among others.

In the current project, the team used data from three key meteorological variables conducive to lightning from the summers of 1995–2022 to train the network in each grid box to make mid-century projections.

“Instead of developing one model to predict lightning everywhere, we really went in on a finer scale to predict lightning at each 1-degree box,” Kalashnikov says.

The models identified days when cloud-to-ground lightning would be likely for each grid cell; researchers also quantified how many of these days are expected to be high fire-weather days, using the Fire Weather Index, a measure of wildfire risk based on weather and climate conditions. Critically, the authors found that most locations will experience an increased risk of lightning-caused fires due to increases in the Fire Weather Index, even in places where lightning occurrence might not increase.
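
A small sketch of how such a combined count can be computed is shown below; the probability cutoff, the Fire Weather Index threshold, and the simulated daily values are assumptions for illustration only.

```python
# Count days flagged as both likely lightning days and high fire-weather days.
# The probability threshold and FWI cutoff below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
lightning_prob = rng.random(92)        # daily model output for one summer (assumed)
fwi = rng.uniform(0, 60, size=92)      # daily Fire Weather Index (assumed scale)

lightning_days = lightning_prob > 0.5  # hypothetical cutoff for a "lightning day"
high_fire_weather = fwi > 30           # hypothetical "high FWI" cutoff

risk_days = np.sum(lightning_days & high_fire_weather)
print(f"{risk_days} days with both likely lightning and high fire weather")
```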

An increase in lightning days does not translate one-to-one into an increase in fire risk, however, because fire risk also depends on other variables, such as temperature, rainfall, wind, and vegetation dryness. Across the Rockies, for example, the number of days with a high likelihood of lightning-caused fires is expected to grow by three or more days by the mid-21st century, even though the overall increase in lightning days is larger.

On the other hand, parts of Utah and Arizona showed a reduction in lightning days—but an increase in days of potential lightning-caused fires, due to higher wildfire risk in general.

The Southwest showed fewer projected increases in lightning days—and even declines in some areas—but the region is still expected to see a rise in days with a likelihood of wildfires ignited by lightning.

Kalashnikov led the project while completing his PhD at WSU and is now a post-doctoral fellow at the University of California, Merced. Coauthors included researchers from UC Merced, Colorado State University, Portland State University, and other institutions.

Source: Washington State University


This mindset can boost mental health in midlife

A middle age man in a red shirt and glasses smiles while standing in front of a white background.

Middle-aged adults who adopt an attitude of joyful acceptance toward all of life’s experiences—both good and bad—enjoy better mental health, particularly when they feel socially connected, a new study suggests.

The research in The Humanistic Psychologist centers on the concept of amor fati, a Latin phrase proclaimed by Friedrich Nietzsche more than 100 years ago that means “love of one’s fate.” The study suggests that people who embrace amor fati are more likely to flourish and less likely to languish in midlife.

The study, led by University of Michigan psychologist Edward Chang, surveyed 111 Americans ages 35 to 60 to explore how amor fati relates to mental health, social connectedness, and loneliness.

“Amor fati isn’t about passive acceptance,” says Chang, professor of psychology. “It’s a joyful, deliberate engagement with everything life throws at you, including suffering. My findings suggest this mindset can play a powerful role in helping middle-aged adults thrive.”

Midlife is often characterized by unique psychological stressors—career plateaus, the demands of raising children and caring for aging parents, the death of loved ones, and increased awareness of one’s own mortality. These pressures can lead to feelings of isolation or emptiness.

To explore how amor fati might buffer against these effects, participants were asked to respond to statements reflecting this attitude, as well as questions about their social connectedness and loneliness, and their overall mental health—measured in terms of flourishing and languishing.

Flourishing was defined as the presence of positive experiences in one’s life. Languishing, on the other hand, referred to the absence of such experiences.

The study found that people who scored higher on amor fati also reported feeling more socially connected and less lonely—factors that were linked to greater flourishing. In other words, those who embraced amor fati tended to feel more connected to others and less lonely, which in turn further boosted their sense of flourishing.

Interestingly, the connection between amor fati and reduced languishing was partly explained by increased social connectedness alone. Loneliness did not play the same mediating role in that part of the model.
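
For readers unfamiliar with how a mediation result like this is typically computed, the sketch below runs a standard product-of-coefficients check on simulated data; it is illustrative only and not the study’s data or exact statistical model.

```python
# Product-of-coefficients mediation sketch with simulated data (illustrative;
# not the study's data or exact statistical model).
import numpy as np

rng = np.random.default_rng(1)
n = 111
amor_fati = rng.normal(size=n)
connectedness = 0.5 * amor_fati + rng.normal(size=n)                       # path a
flourishing = 0.4 * connectedness + 0.2 * amor_fati + rng.normal(size=n)   # path b plus a direct effect

def ols(y, X):
    """Ordinary least squares; returns slopes, dropping the intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

a = ols(connectedness, amor_fati)[0]                                   # amor fati -> connectedness
b = ols(flourishing, np.column_stack([connectedness, amor_fati]))[0]   # connectedness -> flourishing, controlling amor fati
print(f"indirect (mediated) effect a*b = {a * b:.2f}")
```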

The study encourages a rethinking of how we experience difficult moments. For example, being alone doesn’t necessarily have to be seen as a negative state. Rather, both solitude and companionship can be meaningful aspects of life when approached with amor fati.

Similarly, middle-aged adults juggling the care of young children and elderly parents—often referred to as the “sandwich generation”—might find relief in reframing their responsibilities. Instead of viewing caregiving as a heavy burden, Chang suggests recognizing it as a continuation of what previous generations endured.

“Caring is a choice,” Chang says. “And whether it’s for your children, your parents or yourself, these acts of care are deeply connected to personal growth and fulfillment.”

The study opens new doors for understanding mental health during midlife—a life stage often overshadowed in psychological research. It also offers practical insight for those seeking meaning and resilience amid life’s challenges.

Ultimately, the findings suggest that learning to embrace—not just endure—life’s full spectrum of experiences may be key to thriving as we age, Chang says.

Source: University of Michigan


Tuesday, August 26, 2025

Birth order may affect your mutual fund manager’s decisions

A young girl wearing askew sunglasses smiles for a picture as she holds her baby sibling.

New research reveals how childhood sibling dynamics may shape billion-dollar financial decisions.

Many people know nothing about the mutual fund manager whose investment decisions affect the performance of their IRA and 401(k) accounts.

But the new research suggests it’s well worth knowing who’s minding your funds—specifically their birth order—because of the implications for how your fund performs.

Finance professor Vikas Agarwal of Georgia State University and his coauthors at the University of St. Gallen share their findings in a new paper in the Journal of Finance, including evidence that a mutual fund manager’s birth order can be a strong predictor of investment behavior.

The researchers found that managers who were born later among siblings take more financial and regulatory risks. And although one may expect bigger risks to come with bigger rewards, the data shows otherwise: later-born managers tend to underperform their firstborn peers on nearly every risk-adjusted performance metric.

Agarwal and his coauthors analyzed decades of data for more than 1,400 US-based mutual fund managers who manage nearly 1,800 mutual funds. The researchers combined performance statistics with detailed family background information, including birth rank, sibling age gaps, and even parental income and education, sourced from obituaries, public records like US Census data, and databases like Ancestry.com.

This is one of the first studies to use these types of data to quantify how birth order affects mutual fund management.

“We did not use just a standard database—we had to analyze many different data sources to get the level of family details on each manager, because what we really wanted to understand is the root cause of this risk-taking behavior by later-born siblings,” says Agarwal.

“What we found is that it boils down to sibling rivalry, as later-born children may develop sensation-seeking traits as a way to stand out in the family.”

The study’s results are both striking and statistically robust. A manager’s birth order was strongly correlated with several key indicators of risk-taking. For every step down in birth order (from firstborn to second-born, for example), total fund risk increased by 0.37 percentage points annually. Active risk, a measure of how much a fund’s returns deviate from its benchmark, rose by 0.65 percentage points.

Later-born managers were also more likely to invest in so-called lottery stocks—companies with low prices, high volatility, and a small chance of outsized returns—such as biotech startups or meme stocks. This gambling-style behavior may resonate with sensation-seeking personality types but is often out of sync with investor preferences for stable, long-term growth.

Perhaps most notably, later-born managers were more frequently cited for regulatory or civil violations, according to data from FINRA’s BrokerCheck records, including infractions like late disclosures, misreporting, and customer disputes.

The findings show this extra risk-taking did not translate into better performance. On measures such as the Sharpe ratio (a widely used metric of an investment’s or portfolio’s risk-adjusted return), the information ratio, and peer-adjusted alpha, later-born managers consistently underperformed. The researchers estimate that about half of this underperformance can be traced to their preference for lottery stocks.
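
For reference, the Sharpe ratio is simply the average return in excess of a risk-free rate, divided by the volatility of those excess returns. The short sketch below computes it from made-up monthly returns; the return series, risk-free rate, and annualization convention are assumptions for illustration.

```python
# Sharpe ratio sketch with made-up monthly returns (illustrative only).
import numpy as np

fund_returns = np.array([0.012, -0.004, 0.021, 0.008, -0.015, 0.010])  # monthly, assumed
risk_free = 0.002                                                      # monthly risk-free rate, assumed

excess = fund_returns - risk_free
sharpe_monthly = excess.mean() / excess.std(ddof=1)
sharpe_annualized = sharpe_monthly * np.sqrt(12)   # common annualization convention
print(f"annualized Sharpe ratio: {sharpe_annualized:.2f}")
```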

To probe the mechanism behind this phenomenon, the authors examined whether certain family conditions made the birth order effect stronger. They found that the effect was amplified in families with fewer parental resources and narrower age gaps between siblings.

“Siblings closer in age are fighting for the same resources as children, so we found this competitiveness stronger when the age gaps are smaller and parental attention is limited due to single parent or dual working parent households,” says Agarwal.

With trillions of dollars parked in 401(k)s and IRAs, understanding the behavioral drivers of fund managers is crucial for investors, advisors, and regulators.

“This is important because it’s not the fund managers’ money—it’s the money of the investors. And if these fund managers are taking risky, gambling-type decisions with other people’s money, then it has implications for the welfare of the investors who are investing in this mutual fund,” says Agarwal.

While birth order obviously isn’t something asset managers disclose in their prospectuses, it adds a new dimension to the ongoing debate about what makes a good fund manager. Beyond credentials and track records, personality traits rooted in early childhood may influence how aggressively a manager plays the market.

So next time you’re choosing a fund, you might just wonder: Is the person behind the portfolio a cautious firstborn or a risk-hungry youngest sibling trying to make their mark?

Source: Georgia State


Too many pills in middle age may harm your strength and balance

Many different pills sit on a white background.

A new study suggests that taking multiple medications may be associated with how strong, mobile, and steady you feel even before old age sets in.

Researchers using the long-running CARDIA study looked at data from nearly 2,000 adults, average age 60, and found that nearly 1 in 3 were taking five or more medications—a threshold considered polypharmacy.

Those taking five or more prescriptions walked slower, had weaker grip strength, and showed worse balance than peers on fewer medications.

“What this tells us is that physical decline related to medication use may not be just a problem for people in their 70s or 80s,” says lead study author Caroline Sloan, an internist at Duke Health and expert in population health sciences. “It could start showing up in your 50s or early 60s, when people still expect to be independent and mobile.”

There’s strong evidence that taking five or more medications can worsen physical function and increase the risk of falls in older adults, partly due to drug interactions or overlapping health conditions.

But few studies have looked at these effects in middle-aged adults.

Sloan worked with Duke Health geriatrician and chronic disease expert Christopher Barrett Bowling on the study in the Journal of General Internal Medicine.

The researchers measured physical function using five standardized tests—including grip strength, gait speed, and a six-minute walk—and combined the results into a single score called the CARDIA Physical Performance (CAPP) score.
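
One common way to build such a composite is to score each test on a 0-to-4 scale and sum the five scores into a 0-to-20 total; the sketch below illustrates that idea with made-up cut points, and two of the test names (chair-stand rate and balance time as scored components) are hypothetical, so this is not the actual CAPP scoring rule.

```python
# Hypothetical composite-score sketch: score five physical tests 0-4 each and
# sum to a 0-20 total. The cut points below are made up, not the CAPP rules.
import bisect

def score_component(value, cut_points):
    """Map a raw test value to 0-4 using ascending cut points (higher = better)."""
    return bisect.bisect_right(cut_points, value)  # 0..len(cut_points)

# Assumed cut points and units, purely for illustration.
components = {
    "grip_strength_kg": (20, 26, 32, 38),
    "gait_speed_m_per_s": (0.8, 1.0, 1.2, 1.4),
    "six_minute_walk_m": (350, 425, 500, 575),
    "chair_stands_per_s": (0.3, 0.4, 0.5, 0.6),
    "balance_seconds": (10, 20, 30, 40),
}

participant = {
    "grip_strength_kg": 30,
    "gait_speed_m_per_s": 1.1,
    "six_minute_walk_m": 480,
    "chair_stands_per_s": 0.45,
    "balance_seconds": 25,
}

total = sum(score_component(participant[k], cuts) for k, cuts in components.items())
print(f"composite score: {total} / 20")
```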

Those taking five or more medications scored, on average, 1.24 points lower on the 20-point CAPP scale than those not taking multiple medications—a gap that reflects a real and meaningful difference.

Results from an additional part of the study suggest that it’s not necessarily the type of drug that matters, but the sheer number.

The researchers examined the use of potentially inappropriate medications (PIMs), a category defined by the Beers Criteria as medications that should be avoided in older adults, which includes certain heart and anxiety medicines along with other drugs. While 25% of participants were taking at least one PIM, use of these medications alone did not independently predict lower physical performance once polypharmacy was accounted for.

“We expected that the specific drugs considered inappropriate for older adults would have the biggest impact,” Sloan says. “But instead, it was really the number of medications that stood out.”

To be clear, the study doesn’t prove that taking multiple medications causes poor physical function. Polypharmacy is common in adults, and people may be taking more medications because of underlying health problems that also affect their strength and mobility.

Still, the association offers a potential red flag for clinicians, especially primary care doctors who may not routinely assess physical performance in patients under age 65.

“If I’m seeing a 58-year-old on 15 medications, that should prompt me to think about their physical function,” Sloan says. “They may benefit from a physical therapy referral or an exercise program and more broadly, it’s an opportunity to take a closer look at their medication list.”

The study found no significant differences in the impact of polypharmacy by sex or race.

However, women and Black participants were more likely to be on higher numbers of medications, highlighting potential disparities in how chronic conditions are managed in midlife.

The research adds to growing calls for more individualized prescribing and more attention to functional health—not just blood pressure and lab values—as people enter older age.

Funding for the work came from the National Institute on Aging and Duke Pepper Older Americans Independence Center.

Source: Duke University
