Health & Biotech

How Ultromics Is Focusing on Early Heart Failure Detection With Women’s Health in Mind

A new bet on early heart failure detection and why women’s health is at the center.

Updated

January 8, 2026 6:28 PM

A doctor holding an artificial heart model. PHOTO: ADOBE STOCK

Heart disease does not always announce itself clearly, especially in women. Many of the symptoms are ordinary, including fatigue, shortness of breath and swelling. These signs are frequently dismissed or explained away. As a result, many women are diagnosed late, when treatment options are narrower and outcomes are worse. That diagnostic gap is the context behind a recent investment involving Ultromics and the American Heart Association’s Go Red for Women Venture Fund.

Ultromics is a health technology company that uses artificial intelligence to help doctors spot early signs of heart failure from routine heart scans. It has received a strategic investment from the American Heart Association’s Go Red for Women Venture Fund.

The focus of the investment is a long-standing blind spot in cardiac care. Heart failure with preserved ejection fraction, or HFpEF, affects millions of people worldwide, with women disproportionately impacted. It is one of the most common forms of heart failure, yet also one of the hardest to diagnose. Studies show that women are twice as likely as men to develop the condition, and around 64% of cases go undiagnosed in routine clinical practice.

Ultromics works with a tool most patients already experience during heart care: the echocardiogram. There is no new scan and no added burden for patients. Its software analyzes standard heart ultrasound images and looks for subtle patterns that point to early heart failure. The goal is clarity: give clinicians better signals earlier, before the disease advances.

“Heart failure with preserved ejection fraction is one of the most complex and overlooked diseases in cardiology. For too long, clinicians have been expected to diagnose it using tools that weren't built to detect it and as a result, many patients are identified too late,” said Ross Upton, PhD, CEO and Founder of Ultromics. “By augmenting physicians' decision making with EchoGo, we can help them recognize disease at an earlier stage and treat it more effectively.”

The stakes are high. Because the condition is so often missed, many patients begin treatment late. That delay matters: new therapies can reduce hospitalizations and improve survival, but only if patients are diagnosed in time.

This is why early detection has become a priority for mission-driven investors. “Closing the diagnostic gap by recognizing disease before irreversible damage occurs is critical to improving health for women—and everyone,” said Tracy Warren, Senior Managing Director, Go Red for Women Venture Fund. “We are gratified to see technologies, such as this one, that are accepted by leading institutions as advances in the field of cardiovascular diagnostics. That's the kind of progress our fund was created to accelerate.”

Ultromics’ platform is already cleared by regulators for clinical use and is being deployed in hospitals across the US and UK. The company says its technology has analyzed hundreds of thousands of heart scans, helping clinicians reach clearer conclusions when traditional methods fall short.

The investment reflects a broader shift in healthcare. Attention is moving earlier, toward detection instead of reaction, and toward tools that fit into existing care rather than complicate it. In this case, the funding is not about introducing something new into the system. It is about seeing what has long been missed, and doing so in time.

Keep Reading

Artificial Intelligence

Can a Toy Teach a Child to Read Like a Human Would? Inside the Rise of AI Reading Companions

A closer look at how reading, conversation, and AI are being combined

Updated

February 7, 2026 2:18 PM

Assorted plush character toys piled inside a glass claw machine. PHOTO: ADOBE STOCK

In the past, “educational toys” usually meant flashcards, prerecorded stories or apps that asked children to tap a screen. ChooChoo takes a different approach. It is designed not to talk at children, but to talk with them.

ChooChoo is an AI-powered interactive reading companion built for children aged three to six. Instead of passively playing back stories, it engages kids in conversation while reading. It asks questions, reacts to answers, introduces new words in context and adjusts the story flow based on how the child responds. The goal is not entertainment alone, but language development through dialogue.

That idea is rooted in research, not novelty. ChooChoo is inspired by dialogic reading methods from Yale’s early childhood language development work, which show that children learn language faster when stories become two-way conversations rather than one-way narration. Used consistently, this approach has been shown to improve vocabulary, comprehension and confidence within weeks.

The project was created by Dr. Diana Zhu, who holds a PhD from Yale and whose research focused on how children acquire language. Her aim with ChooChoo was to turn academic insight into something practical and warm enough to live in a child’s room. The result is a device that listens, responds and adapts instead of simply playing content on command.

What makes this possible is not just AI, but where that AI runs.

Unlike many smart toys that rely heavily on the cloud, ChooChoo is built on RiseLink’s edge AI platform. That means much of the intelligence happens directly on the device itself rather than being sent back and forth to remote servers. This design choice has three major implications.

First, it reduces delay. Conversations feel natural because the toy can respond almost instantly. Second, it lowers power consumption, allowing the device to stay “always on” without draining the battery quickly. Third, it improves privacy. Sensitive interactions are processed locally instead of being continuously streamed online.

RiseLink’s hardware, including its ultra-low-power AI system-on-chip designs, is already used at large scale in consumer electronics. The company ships hundreds of millions of connected chips every year and works with global brands like LG, Samsung, Midea and Hisense. In ChooChoo’s case, that same industrial-grade reliability is being applied to a child’s learning environment.

The result is a toy that behaves less like a gadget and more like a conversational partner. It engages children in back-and-forth discussion during stories, introduces new vocabulary in natural context, pays attention to comprehension and emotional language, and adjusts its pace and tone based on each child’s interests and progress. Parents can also view progress through an optional app that shows what words their child has learned and how the system is adjusting over time.

What matters here is not that ChooChoo is “smart,” but that it reflects a shift in how technology enters early education. Instead of replacing teachers or parents, tools like this are designed to support human interaction by modeling it. The emphasis is on listening, responding and encouraging curiosity rather than testing or drilling.

That same philosophy is starting to shape the future of companion robots more broadly. As edge AI improves and hardware becomes smaller and more energy efficient, we are likely to see more devices that live alongside people instead of in front of them. Not just toys, but helpers, tutors and assistants that operate quietly in the background, responding when needed and staying out of the way when not.

In that sense, ChooChoo is less about novelty and more about direction. It shows what happens when AI is designed not for spectacle, but for presence. Not for control, but for conversation.

If companion robots become part of daily life in the coming years, their success may depend less on how powerful they are and more on how well they understand when to speak, when to listen and how to grow with the people who use them.