by Kitty Broihier

Originally published on Guiding Stars Health & Nutrition News

Maybe you’ve never given your liver a second thought. But there’s a good chance you’ve seen social media posts and videos that have you wondering if yours is ok. Please don’t rush out and buy some random liver detox potion “just in case.” Instead, take a minute to read up on what you need to know about your amazing liver and how to keep this largest internal organ in good shape.

Meet Your Liver

Your liver is a workaholic. Out of the hundreds of jobs it does, the biggest is to filter your blood—about 22 gallons every hour. Its other functions support metabolism, immunity, digestion, and detoxification. Here are some examples of what your liver is up to day in and day out: 

  • Producing bile, which helps break down fats in the small intestine
  • Producing cholesterol and proteins that transport fats around the body
  • Storing excess glucose as glycogen for when that energy is needed
  • Regulating blood clotting
  • Storing fat-soluble vitamins
  • Metabolizing and clearing harmful substances, such as drugs and toxins, from the body

Liver Disease Basics

Given how important the liver is to health, it makes sense to take care of it as best you can. Of course, not all liver diseases are improved with a healthy diet. Some liver diseases are genetic or the result of an autoimmune condition, virus, or injury. However, fat accumulation in the liver (often referred to as a “fatty liver”) is related to lifestyle. Heavy alcohol consumption, for example, can cause one type of fatty liver. Another used to be called non-alcoholic fatty liver disease and is now known as Metabolic Dysfunction-Associated Steatotic Liver Disease (MASLD). High cholesterol, obesity, metabolic syndrome, and type 2 diabetes are all linked to MASLD.

MASLD is the most common liver disease worldwide, and that’s what we’re focusing on here. A healthy diet and regular exercise don’t guarantee you won’t get liver disease. But taking steps toward prevention can help correct common characteristics among people with fatty liver, including insulin resistance, inflammation, elevated blood fats, and an imbalanced gut microbiome.

What Is a Liver-Friendly Diet?

First, let me share what a liver-friendly eating plan isn’t. It isn’t something that relies on liver “cleanses,” “flushes,” or pricey supplements. A diet that’s healthy for your liver includes plenty of whole foods that support liver function. You’ll eat plenty of fruits and vegetables, nuts and seeds, whole grains, and seafood (especially varieties high in omega-3 fatty acids). And you’ll drink coffee and green tea, along with lots of water. What you’ll avoid are fast food, highly processed foods, added sugars, and alcohol. If you recognize this as being a lot like the Mediterranean diet, you’re right!

Foods for a Healthy Liver

A nutrient-dense, plant-forward eating style supports liver function by helping to correct the conditions that contribute to fatty liver development. This type of diet works for liver health because it:

Provides plenty of polyphenols — these antioxidants can help prevent MASLD by lowering inflammation, boosting insulin sensitivity, and decreasing fat accumulation. Good sources of polyphenols include:

  • Berries
  • Nuts and seeds
  • Dark chocolate/cacao
  • 100% whole grain wheat or rye
  • Dark, leafy greens
  • Dried herbs and spices
  • Coffee and green/black tea

Fights chronic inflammation — by supplying natural compounds that help dampen inflammatory responses while also limiting foods known to contribute to inflammation. An anti-inflammatory eating plan is rich in fiber and nutrients, while low in saturated fats, sugar, and red/processed meat.

Helps with weight management — getting to (and maintaining) a healthy weight is a main focus for preventing and treating MASLD. A Mediterranean eating style is a great choice of eating plan for liver support. It’s a healthy and sustainable way to lose weight. And it emphasizes many of the same foods that aid liver function, too.

Looking for Recipe Ideas?

If you’re feeling stuck for ideas about recipes that are suitable for liver health, check out the collection of Mediterranean-inspired recipes on the Guiding Stars website (enter “Mediterranean” into the search box). The Mediterranean Antipasto Tuna Salad is an easy and economical favorite of mine, and perfect for upcoming warmer weather.

About Guiding Stars

Guiding Stars is an objective, evidence-based nutrition guidance program that evaluates foods and beverages to make nutritious choices simple. Products that meet transparent nutrition criteria earn a 1, 2, or 3 star rating for good, better, and best nutrition. Guiding Stars can be found in more than 2,000 grocery stores, in Circana’s Attribute Marketplace, and through the Guiding Stars Food Finder app.



New foundational technologies that change the world often require massive infrastructure to achieve scale and global adoption. Historically, innovation follows this path: Railroads, highways and global communications networks each enabled new modes of commerce and connection, but only after significant investments in infrastructure.

It’s no different for artificial intelligence.

This Earth Day, as AI infrastructure rapidly expands, one message is clear: Energy efficiency will largely shape the long-term impacts of this transformational technology. In the years ahead, the bottleneck for AI is becoming less about compute and more about real-world constraints on power, cooling, water and grid capacity. Global data center electricity demand is projected to more than double by 2030, reaching about 945 terawatt-hours per year, roughly the current electricity consumption of Japan.

We are at a pivotal moment for technology companies, data center operators, policymakers and standard-setting bodies to accelerate adoption of open standards, modular designs and system‑level efficiency. At AMD, we are committed to these principles and are laser-focused on maximizing compute performance per watt of energy, especially in the data center.

For more than a decade, we have set and achieved bold, public, time-bound goals that scale from chips to accelerated compute nodes to full server racks.[i] Our current goal is to deliver a 20x improvement in rack‑scale energy efficiency for AI training and inference between 2024 and 2030.[ii] What does that mean in practice? Training an average AI model in 2025 that may require several hundred server racks could require roughly one rack by 2030, using 95% less electricity and producing a fraction of the carbon emissions.[iii]

It’s an ambitious goal. And Earth Day is a fitting moment to explain how we plan to get there.

Start with Efficiency at the Core

In digital infrastructure, inefficiency compounds. Energy wasted at the processor ripples outward through the server, cluster, data center and, ultimately, the grid – driving additional demand for cooling, power conversion, redundancy and transmission, magnifying inefficiency across the ecosystem.

When chips and servers deliver more performance per watt, the benefits cascade across the entire system. At a time when demand for AI compute far exceeds supply, maximizing existing data center and grid infrastructure is imperative. This is also an area where AI itself can help.

At AMD, we applied AI‑driven automation and analytics to our own internal IT grid infrastructure, reducing operational and maintenance costs by an average of 20% to 25%.[iv] By using AI to predict demand, optimize utilization in real time and automate issue resolution through intelligent workflows and chatbots, we shifted from reactive infrastructure management to a more adaptive, self‑optimizing model. The effort demonstrates how AI can unlock efficiency gains both in products and across digital infrastructure itself.

Scale Through Modular and Open Design

Unlocking the next step‑change in efficiency increasingly depends on deep industry collaboration and transparency across hardware, software and systems integration. Design choices at the chip and rack level directly affect cooling, power distribution, facility design and grid demand. When these elements are optimized in isolation or locked into proprietary systems, the ecosystem bears the cost.

That is why AMD is committed to an open ecosystem. Industry alignment around open standards and interoperable designs allows innovation to scale rapidly, deployments to accelerate and energy efficiency gains to compound.

We are proud to hold leadership roles in the Open Compute Project, where companies share design specifications to enable interoperable systems, modular building blocks and component-level upgrades that extend lifecycles. We also help lead organizations like The Green Grid to advance common definitions and system-level thinking on energy and water efficiency, supporting more consistent design choices, benchmarking and transparency across the data center ecosystem. In software, we support open-source development through the AMD ROCm™ platform. We believe continued innovation at the software and AI-model level will act as a powerful force multiplier, amplifying our 20x energy-efficiency goal by up to fivefold and together enabling a potential 100x increase in the energy efficiency of AI training by 2030.[v]

Manage Resources Across the Life Cycle

Modern AI server racks contain tens of thousands of components, weigh several thousand pounds and can embody tons of carbon emissions before they are ever powered on. Transporting, decommissioning and recycling introduce additional costs and emissions over the hardware life cycle. Servers that operate more efficiently and for longer deliver more useful compute within real-world economic and environmental constraints.

Viewed through a circular economy lens, responsible life cycle management prioritizes extracting the greatest practical value from every component across every stage of its life cycle. It starts with modular, open designs that enable interoperable, repairable and upgradable systems. It includes extending system lifetimes, adapting to changing workloads and minimizing waste. And when systems reach end-of-use, component recovery and high-value recycling can return materials to the supply chain while helping create space for new generations of more energy-efficient servers.

These practices can allow organizations to consolidate equipment and reduce energy use or increase compute performance without expanding their physical footprint. This can deliver total cost of ownership benefits that include electricity and carbon emissions (Scope 2), while reducing or deferring upstream and downstream value chain emissions (Scope 3). Managing resources across the life cycle helps defer both financial and environmental costs.

Design for Sustainability

Taken together, efficiency at the compute layer, openness at the system level and responsibility across the full life cycle form a powerful flywheel. Performance-per-watt gains cascade across data centers, value chains and the grid.

As AI continues to scale, it is increasingly evident that innovation and sustainability are most effective when they are intentionally designed to move together.

Footnotes


[i] Statement based on AMD setting its 25x20 goal in 2014 and publishing public energy-efficiency goals through the current period (2026): https://ir.amd.com/news-events/press-releases/detail/957/amd-exceeds-six-year-goal-to-deliver-unprecedented-25-times-improvement-in-mobile-processor-energy-efficiency; https://www.amd.com/en/corporate/corporate-responsibility/data-center-sustainability.html

[ii] AMD modeled advanced racks for AI training/inference in each year (2024 to 2030) based on AMD roadmaps, also examining historical trends to inform rack design choices and technology improvements and to align projected goals with historical trends. The 2024 rack is based on the MI300X node, which is comparable to the Nvidia H100 and reflects common practice in AI deployments in the 2024/2025 timeframe. The 2030 rack is based on AMD system and silicon design expectations for that time frame. In each case, AMD specified components like GPUs, CPUs, DRAM, storage, cooling, and communications, tracking component-level and rack-level characteristics for power and performance. Calculations do not include power used for cooling air or water supply outside the racks but do include power for fans and pumps internal to the racks.

             FLOPS    HBM BW    Scale-up BW
  Training   70.0%    10.0%     20.0%
  Inference  45.0%    32.5%     22.5%

Performance and power use per rack together imply trends in performance per watt over time for training and inference. Indices for progress in training and inference are then weighted 50:50 to get the final estimate of AMD projected progress by 2030 (20x). The performance number assumes continued AI model progress in exploiting lower-precision math formats for both training and inference, which results in both an increase in effective FLOPS and a reduction in required bandwidth per FLOP.
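As a rough illustration, the index arithmetic described above can be sketched in code. The per-metric gain figures and the use of a weighted geometric mean are illustrative assumptions only — not AMD data and not necessarily AMD's exact methodology — but the structure (metric weights from the table, performance per watt, then a 50:50 training/inference blend) follows the description:

```python
# Sketch of a rack-level efficiency index: weighted per-metric gains,
# divided by rack power growth, then blended 50:50 across workloads.
# All gain numbers below are HYPOTHETICAL placeholders.

def weighted_geomean(gains, weights):
    """Weighted geometric mean of per-metric gain factors (weights sum to 1)."""
    assert abs(sum(weights) - 1.0) < 1e-9
    result = 1.0
    for g, w in zip(gains, weights):
        result *= g ** w
    return result

# Metric weights from the table: (FLOPS, HBM BW, scale-up BW)
TRAIN_W = (0.70, 0.10, 0.20)
INFER_W = (0.45, 0.325, 0.225)

# Hypothetical 2024->2030 gain factors per metric, and rack power growth
train_gains = (60.0, 20.0, 30.0)   # placeholder numbers, not AMD data
infer_gains = (50.0, 25.0, 28.0)   # placeholder numbers, not AMD data
power_growth = 2.0                 # rack power also rises (placeholder)

train_ppw = weighted_geomean(train_gains, TRAIN_W) / power_growth
infer_ppw = weighted_geomean(infer_gains, INFER_W) / power_growth

# Final estimate: training and inference progress weighted 50:50
overall = 0.5 * train_ppw + 0.5 * infer_ppw
print(f"train {train_ppw:.1f}x, inference {infer_ppw:.1f}x, blended {overall:.1f}x")
```

With different placeholder inputs the blended figure changes accordingly; the point is only to show how per-metric weights and the 50:50 split combine into one number.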

We commissioned Dr. Koomey to analyze historical industry data and projected AMD data on compute performance and power consumption. We then worked with Dr. Koomey to develop a goal methodology aligned with industry-accepted best-practices for efficiency assessments. This methodology allows us to compare our goal to historical industry gains, track our progress against the goal over time, and to estimate environmental benefits of achieving the goal in real world AI deployment.

[iii] AMD estimated the number of racks to train a typical notable AI model based on EPOCH AI data (https://epoch.ai). For this calculation we assume, based on these data, that a typical model takes 10^25 floating point operations to train (based on the median of 2025 data), and that this training takes place over 1 month. FLOPS needed = 10^25 FLOPs/(seconds/month)/Model FLOPs utilization (MFU) = 10^25/(2.6298*10^6)/0.6. Racks = FLOPS needed/(FLOPS/rack in 2024 and 2030). The compute performance estimates from the AMD roadmap suggest that approximately 276 racks would be needed in 2025 to train a typical model over one month using the MI300X product (assuming 22.656 PFLOPS/rack with 60% MFU) and <1 fully utilized rack would be needed to train the same model in 2030 using a rack configuration based on an AMD roadmap projection. These calculations imply a >276-fold reduction in the number of racks to train the same model over this six-year period. Electricity use for a MI300X system to completely train a defined 2025 AI model using a 2024 rack is calculated at ~7 GWh, whereas the future 2030 AMD system could train the same model using ~350 MWh, a 95% reduction. AMD then applied carbon intensities per kWh from the International Energy Agency World Energy Outlook 2024 [https://www.iea.org/reports/world-energy-outlook-2024]. IEA’s stated policy case gives carbon intensities for 2023 and 2030. We determined the average annual change in intensity from 2023 to 2030 and applied that to the 2023 intensity to get the 2024 intensity (434 g CO2/kWh) versus the 2030 intensity (312 g CO2/kWh). Emissions for the 2024 baseline scenario (7 GWh x 434 g CO2/kWh) equate to approximately 3,000 metric tCO2, versus around 100 metric tCO2 for the future 2030 scenario (350 MWh x 312 g CO2/kWh).
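The arithmetic in this footnote can be checked with a short script. The constants come directly from the footnote; note that the straightforward calculation lands near 280 racks rather than exactly 276, a small gap presumably due to rounding in the cited figures:

```python
# Restating the footnote's rack-count and emissions arithmetic.
SECONDS_PER_MONTH = 2.6298e6
MODEL_FLOPS = 1e25        # FLOPs to train a typical notable model (2025 median)
MFU = 0.6                 # model FLOPs utilization

sustained_flops = MODEL_FLOPS / SECONDS_PER_MONTH   # required sustained rate
peak_flops_needed = sustained_flops / MFU           # peak rate at 60% MFU

rack_2024_flops = 22.656e15                         # MI300X rack: 22.656 PFLOPS
racks_2024 = peak_flops_needed / rack_2024_flops
print(f"2024 racks needed: {racks_2024:.0f}")       # ~280; the footnote cites ~276

# Energy and emissions comparison
kwh_2024, kwh_2030 = 7e6, 350e3                     # 7 GWh vs 350 MWh
g_per_kwh_2024, g_per_kwh_2030 = 434, 312           # IEA-derived grid intensities
t_2024 = kwh_2024 * g_per_kwh_2024 / 1e6            # grams -> metric tons CO2
t_2030 = kwh_2030 * g_per_kwh_2030 / 1e6
print(f"electricity cut: {1 - kwh_2030 / kwh_2024:.0%}")   # 95%
print(f"emissions: {t_2024:.0f} t vs {t_2030:.0f} t CO2")  # ~3,000 t vs ~100 t
```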

[iv] https://www.amd.com/en/blogs/2026/enhancing-amd-it-grid-infrastructure-efficiency-with-ai.html

[v] Regression analysis of achieved accuracy per parameter across a selection of model benchmarks, such as MMLU, HellaSwag, and ARC Challenge, shows that improving the efficiency of ML model architectures through novel algorithmic techniques, such as Mixture of Experts and State Space Models, can improve their efficiency by roughly 5x during the goal period. Similar numbers are quoted in Patterson, D., J. Gonzalez, U. Hölzle, Q. Le, C. Liang, L. M. Munguia, D. Rothchild, D. R. So, M. Texier, and J. Dean. 2022. “The Carbon Footprint of Machine Learning Training Will Plateau, Then Shrink.” Computer, vol. 55, no. 7, pp. 18-28. Therefore, assuming innovation continues at the current pace, a 20x hardware and system design goal amplified by 5x software and algorithm advancements can lead to a 100x total gain by 2030.


As the official bank of the Buffalo Sabres, KeyBank partnered with the team to host three community playoff pep rallies celebrating the team’s first playoff appearance in 14 years. Through the HocKey Assists program, Sabres alumni, Sabretooth, and KeyBank and Sabres teammates visited Best Buddies, The Resource Council of WNY, and GiGi’s Playhouse to bring the excitement of playoff hockey directly to children and families across Western New York.

Each stop featured mini pep rallies, exclusive Sabres playoff giveaways, and opportunities for kids and families to interact with Sabres alumni who were part of the team’s last playoff run. Participants joined in on chants, games, and activities designed to make playoff excitement accessible and inclusive for everyone.

For KeyBank, these pep rallies reflected the heart of HocKey Assists: using the power of partnership to meet people where they are and create experiences that feel personal, inclusive, and uplifting. By bringing the playoff moment directly into nonprofit spaces, the program helped ensure that the city’s excitement was something everyone could take part in.

KeyBank is proud to stand alongside the Buffalo Sabres and our dedicated nonprofit partners through HocKey Assists, using the energy of the playoffs to connect, celebrate, and give back across Western New York. As the city rallies around our home team, KeyBank is honored to help share this historic moment with the communities that make Buffalo such a special place to call home.

Someone holding up a "we're back" banner

Sabres mascot visiting kids

Four people standing together, one holding up a "we're back" banner

A group holding up "we're back" banners, alongside Sabres mascot

A larger group posing with "we're back" banners in front of a mural that says "Welcome to THE PACK"

Four people posing with mascot

3 people standing together

A fan high-fiving Sabres mascot

Playing hockey indoors

Sabres mascot playing hockey

Bored No More Charity Launches “Pet Packs” for Animal Shelters and a New Donor App — Just in Time for National Volunteer Week, Adopt a Shelter Pet Day, and National Rescue Dog Day

PHILADELPHIA, April 22, 2026 /PRNewswire/ — Bored No More charity (www.BoredNoMoreCharity.org), the youth-founded nonprofit led by Philadelphia teen sisters Taylor Brady (17) and Alexis Brady (18), is a 2026 finalist for the Silver Anvil Award for Nonprofits from the Public Relations Society of America (PRSA) — one of the most prestigious honors in the field — while simultaneously expanding its mission to support animal shelters nationwide.

Since its founding in 2023, Bored No More has delivered health and wellness activity packages reaching more than 3,500 hospitalized children and their families. The nonprofit is now announcing two new initiatives: Bored No More Pet Packs, a program that provides enrichment supplies and adoption gear for dogs and cats to animal shelters across the U.S., and the Bored No More WishLink App, a free donor tool that connects supporters directly to Amazon Wishlists for children's hospitals and animal rescues.

PRSA Silver Anvil Nomination

Bored No More is a 2026 Silver Anvil Finalist in the Community Relations category for Associations and Nonprofit Organizations, recognized for its project: Bored No More: A Vital Teen-Led Health & Wellness Initiative to Improve Pediatric Hospitalization Experiences through Community Outreach. Winners will be announced May 14, 2026, at the 2026 Anvil Awards ceremony in New York City.

“Bored No More Pet Packs”: Helping Shelter Animals Find Their People

Launched in early 2026, Pet Packs grew out of co-founder Taylor Brady’s personal connection to animal rescue — the family adopted their two dogs, Toby and Bear, through a Miami rescue.

“I want other pets to have a more comfortable shelter experience, with every opportunity to get adopted by wonderful families,” said Taylor Brady. “We work directly with shelters to figure out exactly what they need.”

In its first three months, the program has already donated to shelters in Shorter, Alabama; San Francisco and Los Angeles, California; Miami, Fort Lauderdale, and Palm Beach, Florida; Maui, Hawaii; Hampton Bays and New York City, New York; Philadelphia, Pennsylvania; and Carlisle, South Carolina. Donations have included individual items as well as shareable ones that can be enjoyed repeatedly — calming donut beds, fun flavored chew toys, "Adopt Me" leashes and leash wraps, cat playpens, scratch posts, interactive wands, joint supplements, and more — supporting many of the 10,000 dogs and cats at these locations.

The response from shelter staff has been enthusiastic. “I am thrilled that these two young community leaders (Taylor and Alexis Brady of Bored No More) have expanded their charitable mission and impact to include animals in need. Support like this goes far beyond the items themselves,” said Nichole Brophy, Operations Manager at Street Tails Animal Rescue in Philadelphia. “It helps our dogs feel more comfortable, build confidence, and be seen — and that’s what ultimately connects them with the right families.”

The Bored No More WishLink App: Connecting Donors to the Causes They Care About

Taylor Brady also personally developed the WishLink App, a free, searchable directory that links donors directly to Amazon Wishlists of pediatric hospitals and animal shelters. It is easy to use, and people can search by name, city, or state — removing the guesswork from giving.

“I wanted to make it as simple as possible for donors to support causes close to their hearts — or close to their homes,” said Taylor. The app is available at Bored No More Charity WishLink and is provided in the Impact section of the Bored No More website.

Timely Observances

  • National Volunteer Week: April 19–25, 2026
  • Adopt a Shelter Pet Day: April 30, 2026
  • National Rescue Dog Day: May 20, 2026

About Bored No More Charity

Founded in 2023 by Philadelphia sisters Taylor Brady (17) and Alexis Brady (18), Bored No More delivers Kids Care activity bags to hospitalized children — filled with Pop Sockets, fidget spinners, mini Etch-A-Sketches, Matchbox cars, coloring books, Crayola crayons, UNO cards, and more — shown to reduce anxiety in pediatric patients. The new Pet Packs program was inspired by the sisters’ own rescue dogs Toby and Bear, and their (non-rescue) hypoallergenic cats Turtle and Oreo.

The charity is supported by AT&T, Drive David (David Auto), Veronica Beard, Kendra Scott, The Do Gooders Philadelphia, Karma For Cara, ILYAN jewelry, and artist Jessie Tristan Read, and has been featured in Forbes, Yahoo! Creators, Main Line Today (“Top Young Innovators”), ABC 6 Philadelphia, and CBSNews.com.

Learn more: www.BoredNoMoreCharity.org & Instagram: @charityborednomore

New APP: Bored No More Charity WishLink

High-Res Images Here:

Interviews with Taylor and Alexis Brady are available upon request. More photos available upon request.

Contact: Allison Weiss Brady/Simplified Media Agency/ 412572@email4pr.com / 305-968-2323

Michele Iapicco/Simplified Media Agency/ 412572@email4pr.com / 210-560-8250

Cision View original content to download multimedia: https://www.prnewswire.com/news-releases/philadelphia-teen-sisters-earn-prestigious-prsa-silver-anvil-finalist-while-expanding-charity-to-help-shelter-animals-302749576.html

SOURCE Bored No More charity

Exclusive partnership brings AI-powered camera control, live streaming, and game-day moments into a single platform for sports organizations, teams, and families

CHICAGO, April 22, 2026 /PRNewswire/ — TeamSnap, the #1 youth sports management platform, and XbotGo, a leader in AI-powered sports cameras, today announced an exclusive partnership to redefine live streaming in youth sports. Together, the companies are introducing the first fully integrated, all-in-one live streaming experience, bringing camera control, live video, and game-day moments into TeamSnap ONE.

XbotGo is the leading innovator in AI-powered sports video, making broadcast-quality, automated game capture accessible to every team and family. The partnership brings live streaming into the same app teams already use to manage schedules, communication, and game-day coordination. By combining TeamSnap’s scale and reach with XbotGo’s Falcon AI-powered camera technology, the companies are making it easier for coaches, parents, and organizations to capture, share, and relive the moments that matter most.

“We’re focused on making game day simpler and more connected for the entire youth sports community,” said Peter Frintzilas, CEO of TeamSnap. “Families want an easy way to capture and share the moments that matter. Coaches want less friction on game day. Organizations want technology that fits naturally into how they already run their programs. By partnering exclusively with XbotGo, we’re bringing live streaming into TeamSnap ONE and taking another step toward a more complete season experience.”

Today, streaming in youth sports often means piecing together multiple technologies for team operations, camera setup, live video, and sharing. This fragmented approach creates added complexity on game day and disconnects video from the rest of the team experience.

TeamSnap and XbotGo remove that friction with one connected experience inside TeamSnap ONE. The same app teams use to manage their season now powers camera setup, live streaming, and one-tap highlights. Combined with TeamSnap’s platform, which powers registration, schedules, communication, and game-day coordination, the partnership extends that experience into video, connecting everything from signup to game-day moments in one system.

Powered by XbotGo’s Falcon camera, the experience combines AI-powered tracking, robotic zoom, and high-quality 4K video, all built directly into TeamSnap ONE. By integrating advanced camera technology into the platform families and organizations already trust, the partnership makes live streaming a natural part of the season — not a separate workflow on game day.

“As a sports parent myself, I know the struggle of missing a goal while fiddling with a phone. XbotGo was built to make sports video smarter, simpler, and more accessible,” said David Tan, Founder and CEO at XbotGo. “By integrating directly with TeamSnap ONE, we’re bringing that experience directly into the platform sports organizations, teams, and families already rely on every day. This partnership makes it easier to stream every game, capture every moment, and stay connected from anywhere.”

The announcement marks the beginning of a broader evolution in how youth sports communities capture, share, and experience the season. Together, TeamSnap and XbotGo are building a more connected future where team operations, live streaming, and season memories live in one place.

To learn more about the TeamSnap and XbotGo partnership and sign up for our early innovators program, visit https://info.teamsnap.com/xbotgo.html.

About TeamSnap
TeamSnap – the #1 sports management platform – has pioneered the future of youth sports technology for more than 15 years. With 19,000+ sports organizations and more than 30 million parents, players, coaches and administrators across more than 100 different sports and activities, TeamSnap powers the largest and most engaged online community in youth sports. Through TeamSnap, brands have invested more than $20 million in youth sports sponsorships, fueling communities and giving more kids the chance to play.
For more information, visit the TeamSnap website, and follow the company on LinkedIn, Instagram, Facebook and YouTube.

About XbotGo
XbotGo is the consumer AI brand of Blink Tech, Inc., founded by Dr. David Tan — a computer vision expert, hardware innovator, and passionate soccer dad. Driven by the belief that everyone should be able to capture and relive their best sports moments, XbotGo makes pro-level videography easy and accessible for all. With a core team of graduates from globally renowned institutions and veterans from leading tech companies, XbotGo operates in Silicon Valley, Texas, Beijing, Shenzhen, and Suzhou, leveraging global R&D and supply chain advantages to drive innovation worldwide.

Media Contacts:
TeamSnap
Alexandra Shafer
JConnelly for TeamSnap
teamsnap@jconnelly.com 
973-934-5100

XbotGo
Gabriel Roxas
gabrielroxas@xbotgo.com
469-922-3416

Cision View original content to download multimedia: https://www.prnewswire.com/news-releases/teamsnap-and-xbotgo-partner-to-redefine-youth-sports-streaming-with-industry-first-fully-integrated-ai-powered-experience-302749882.html

SOURCE TeamSnap

  • Eaton motor analytics helps predict motor and pump issues up to 30% earlier and 25% more accurately than currently available sensing solutions
  • Available as an add-on to Brightlayer on-premise software, motor analytics enables smarter, more cost-effective and predictive maintenance 

PITTSBURGH, April 22, 2026 /3BL/ – Intelligent power management company Eaton today introduced a new motor analytics software solution designed to help mining, oil and gas, manufacturing, and other industrial applications detect critical equipment issues sooner and more accurately than existing solutions – without installing additional sensors or hardware directly on the motor. By providing a comprehensive view of motor and pump health and performance, the predictive maintenance innovation empowers facility management teams to make smarter and more cost-effective maintenance decisions that drive uptime and energy efficiency.

The Eaton motor analytics solution utilizes motor current signature analysis (MCSA) and machine learning to predict the most common motor and pump problems months in advance. This predictive insight enables maintenance teams to prioritize interventions, minimize unplanned downtime and reduce operational risk. The solution also helps identify inefficient motors, allowing teams to prioritize maintenance efforts for optimized energy consumption and improved system performance.

“Electric motors are the heartbeat of mission-critical operations across nearly every industry, powering everything from pumps and compressors to conveyors and fans,” said Kevin Olikara, product and portfolio software manager at Eaton. “Our latest predictive maintenance innovation gives maintenance teams the actionable insights needed to help reduce unplanned downtime and energy waste while extending equipment life.”

Available as an add-on for any Eaton Brightlayer software for industrial customers, motor analytics features pre-built analytics and inferential sensing that can deliver valuable operational insights within hours. Compared to traditional sensor-based solutions, the Eaton approach requires less frequent maintenance and generates fewer false alarms.

To learn more, visit Eaton.com/MotorAnalytics.

Eaton is an intelligent power management company dedicated to protecting the environment and improving the quality of life for people everywhere. We make products for the data center, utility, industrial, commercial, machine building, residential, aerospace and mobility markets. We are guided by our commitment to do business right, to operate sustainably and to help our customers manage power ─ today and well into the future. By capitalizing on the global growth trends of electrification and digitalization, we’re helping to solve the world’s most urgent power management challenges and building a more sustainable society for people today and generations to come.

Founded in 1911, Eaton has continuously evolved to meet the changing and expanding needs of our stakeholders. With revenues of $27.4 billion in 2025, the company serves customers in 180 countries. For more information, visit www.eaton.com. Follow us on LinkedIn.

Contact:

Regina Parundik
+1.412.559.1614
reginaparundik@eaton.com

###

Key points

  • Coram CVS Specialty Infusion Services transitioned to recyclable, compostable packaging for temperature-sensitive medications — replacing bulky legacy packaging with wood and paper-based insulation.
  • The new innovative packaging is easier to handle, more compact, and designed to improve the patient experience, especially for those with mobility challenges.
  • This change supports our enterprise-wide sustainability objectives while maintaining the same high standards for safety, reliability, care and exceptional patient experiences.

Originally published on CVS Health Company Newsroom

Coram CVS Specialty Infusion Services launched a new innovative packaging solution that supports our enterprise-wide sustainability objectives and that has been tested to perform better than current packaging. Patients have already begun receiving their temperature-sensitive medications in recyclable, paper-based packaging.

  • The new packaging replaces expanded polystyrene (EPS), commonly known as Styrofoam.

Why Coram’s sustainable packaging matters

The transition reflects our commitment to sustainability objectives and patient-centered care. The new packaging is expected to lead to increased efficiency across our business operations, logistics and workflow.

It will also help optimize the use of storage space, deliver a better experience for patients and reduce plastic use and waste. Coram provides specialty infusion and nutrition therapies for patients with chronic and complex conditions. Before this, the life-sustaining medications for these patients were typically sent in EPS coolers and packaging that are bulky and difficult for patients to maneuver, dispose of, store or recycle. The legacy materials can take many years to break down in landfills.

  • Patient feedback had indicated that the old packaging wasn’t delivering the high-quality experience they expect from Coram.

“By replacing difficult-to-recycle materials with compostable and recyclable alternatives, we’re reducing waste, improving the patient experience, and removing thousands of pounds of plastic each year — a win for both patients and the planet,” says Jenny McColloch, Chief Sustainability Officer and VP, Community Impact.

  • “This packaging innovation reflects our commitment to embedding sustainability across our business and advancing our Healthy 2030 impact strategy.”

How the new packaging improves patient experience

The new packaging has been rigorously tested to meet — and in many cases exceed — performance standards for temperature control and durability.

It’s compact, easier to break down, and can be recycled or composted, making it more accessible and convenient for patients, including those with mobility limitations. Much of the new packaging uses a “nested box” approach, which helps keep the delivery consolidated to a single box.

  • Patients will continue to receive their medications safely and reliably. There are no changes to medication quality, delivery timelines, or pharmacy and nursing support.

What patients are saying: “The new packaging is much easier for me to manage physically. And discarding it is more convenient — and less stressful,” notes Beth Krom, a 70-year-old Coram patient in rural upstate New York and an early recipient of the new packaging.

  • “Before, the boxes were really heavy — like 50 pounds — and packed with multiple ice packs and yards of bubble wrap. It was a lot to deal with every month. I only have one small garbage can permitted for the week, so breaking everything down took a lot of time and effort,” she says. “Now, the boxes are really light!”

Beth is also considering using the compostable material in her garden.

Next steps for Coram and CVS Health

Coram has implemented this packaging innovation, starting with pharmacies in Mendota Heights, Minnesota; Malvern, Pennsylvania; and San Diego, California.

“We’re exploring opportunities to extend these learnings to other segments — and build on the success of the integrated, cross-functional team that demonstrated the power of enterprise-wide collaboration to drive meaningful change,” says Tom Underkoffler, Executive Director of Logistics and Packaging.

Learn more about our sustainability efforts and how we’re improving the Coram patient experience at cvs.co/smartpack