Consumer Tech

Next Gen Gates: AI Meets Fashion – Gates’ Bold Move to Dress the Future

With Phia’s AI, the new luxury is knowing what’s worth buying.

Updated

December 19, 2025 9:26 PM

Phoebe Gates and Sophia Kianni, founders of Phia. PHOTO: PHIA

AI has transformed how we shop—predicting trends, powering virtual try-ons and streamlining fashion logistics. Yet some of the biggest pain points remain: endless scrolling, too many tabs and never knowing if you’ve overpaid. That’s the gap Phia aims to close.

Co-founded by Phoebe Gates, daughter of Bill Gates, and climate activist Sophia Kianni, Phia was born in a Stanford dorm room and launched in April 2025. The app, available on mobile and as a browser extension, compares prices across over 40,000 retailers and thrift platforms to show what an item really costs. Its hallmark feature, “Should I Buy This?”, instantly flags whether something is overpriced, fair or a genuine deal.
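The article does not detail how the “Should I Buy This?” verdict is computed, but the general idea of benchmarking a listing against comparable prices can be sketched in a few lines. The function name, thresholds and numbers below are illustrative assumptions, not Phia’s actual logic.

```python
# Hypothetical sketch of a "Should I Buy This?"-style check.
# Thresholds, names and data are assumptions for illustration only.
from statistics import median

def classify_price(listed_price: float, comparable_prices: list[float]) -> str:
    """Label a listing as a deal, fair or overpriced versus comparable listings."""
    if not comparable_prices:
        return "unknown"            # nothing to compare against
    typical = median(comparable_prices)
    ratio = listed_price / typical
    if ratio <= 0.85:
        return "deal"               # well below the typical price
    if ratio <= 1.10:
        return "fair"               # within a normal band of the market
    return "overpriced"             # noticeably above comparable listings

# Example: a jacket listed at $240 against prices seen across other retailers
print(classify_price(240.0, [180.0, 195.0, 210.0, 250.0]))  # -> "overpriced"
```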

The mission is simple: make shopping smarter, fairer and more sustainable. In just five months, Phia has attracted more than 500,000 users, indexed billions of products and built over 5,000 brand partnerships. It also secured a US$8 million seed round led by Kleiner Perkins, joined by Hailey Bieber, Kris Jenner, Sara Blakely and Sheryl Sandberg—investors who bridge tech, retail and culture. “Phia is redefining how people make purchase decisions,” said Annie Case, partner at Kleiner Perkins.  

Phia’s AI engine scans real-time data from more than 250 million products across its network, including Vestiaire Collective, StockX, eBay and Poshmark. Beyond comparing prices, the app surfaces cheaper or more sustainable options by displaying pre-owned items alongside new ones, so shoppers see the full spectrum of choices before they buy. It also evaluates how different brands perform over time, analyzing how well their products hold resale value. This insight helps shoppers judge whether a purchase is likely to hold its value or whether a second-hand version makes more sense. The result is a platform that naturally encourages circular shopping (keeping items in use longer through resale, repair or recycling) and resonates strongly with Gen Z and millennial values of sustainability and mindful spending.
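As a rough illustration of that resale-value signal, a brand’s retention can be expressed as the ratio of typical resale prices to the original retail price. The figures and function below are hypothetical and not drawn from Phia’s data.

```python
# Hypothetical illustration of resale-value retention, not Phia's actual metric.
def resale_retention(retail_price: float, recent_resale_prices: list[float]) -> float:
    """Fraction of the original retail price that items keep on the resale market."""
    if retail_price <= 0 or not recent_resale_prices:
        return 0.0
    average_resale = sum(recent_resale_prices) / len(recent_resale_prices)
    return average_resale / retail_price

# A bag that retailed at $400 and now resells for roughly $300-$340
retention = resale_retention(400.0, [300.0, 320.0, 340.0])
print(f"{retention:.0%} of retail value retained")  # -> 80% of retail value retained
```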

By encouraging transparency and smarter choices, Phia signals a broader shift in consumer technology: one where AI doesn’t just automate decisions but empowers users to understand them. Instead of merely digitizing the act of shopping, Phia embodies data-driven accountability—using intelligent search to help consumers make informed and ethical choices in markets long clouded by complexity. Retail analysts believe this level of visibility could push brands to maintain accurate and competitive pricing. Skeptics, however, argue that Phia must evolve beyond comparison to create emotional connection and loyalty. Still, one fact stands out: algorithms are no longer just recommending what we buy—they’re rewriting how we decide.  

With new funding powering GPU expansion and advanced personalization tools, Phia’s next step is to build a true AI shopping agent—one that helps people buy better, live smarter and rethink what it means to shop with purpose.  


AI

New Physical AI Technology: How Atomathic’s AIDAR and AISIR Improve Machine Sensing

Redefining sensor performance with advanced physical AI and signal processing.

Updated

December 16, 2025 3:28 PM

Robot with human features, equipped with a visual sensor. PHOTO: UNSPLASH

Atomathic, the company once known as Neural Propulsion Systems, is stepping into the spotlight with a bold claim: its new AI platforms can help machines “see the invisible”. With the commercial launch of AIDAR™ and AISIR™, the company says it is opening a new chapter for physical AI, AI sensing and advanced sensor technology across automotive, aviation, defense, robotics and semiconductor manufacturing.

The idea behind these platforms is simple yet ambitious. Machines gather enormous amounts of signal data, yet they still struggle to understand the faint, fast or hidden details that matter most when making decisions. Atomathic says its software closes that gap. By applying AI signal processing directly to raw physical signals, the company aims to help sensors pick up subtle patterns that traditional systems miss, enabling faster reactions and more confident decisions in autonomous systems.

"To realize the promise of physical AI, machines must achieve greater autonomy, precision and real-time decision-making—and Atomathic is defining that future," said Dr. Behrooz Rezvani, Founder and CEO of Atomathic. "We make the invisible visible. Our technology fuses the rigor of mathematics with the power of AI to transform how sensors and machines interact with the world—unlocking capabilities once thought to be theoretical. What can be imagined mathematically can now be realized physically."

This technical shift is powered by Atomathic’s underlying mathematical framework. The core of its approach is a method called hyperdefinition technology, which uses the Atomic Norm and fast computational techniques to map sparse physical signals. In simple terms, it pulls clarity out of chaos. This enables ultra-high-resolution signal visualization in real time, something the company claims has never been achieved at this scale.
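Atomathic has not published its algorithms, but the Atomic Norm it references is a well-established tool in sparse signal recovery. As general background rather than a description of hyperdefinition technology itself: for a dictionary of “atoms” $\mathcal{A}$ (for example, complex sinusoids in radar-style spectral estimation), the atomic norm of a signal $x$ is

$$\|x\|_{\mathcal{A}} \;=\; \inf\{\, t > 0 \;:\; x \in t \cdot \operatorname{conv}(\mathcal{A}) \,\},$$

and a standard way to recover a sparse combination of atoms from a noisy measurement $y$ is the denoising problem

$$\hat{x} \;=\; \arg\min_{x} \; \tfrac{1}{2}\,\|y - x\|_{2}^{2} \;+\; \tau\,\|x\|_{\mathcal{A}},$$

where the weight $\tau$ trades fidelity to the raw signal against sparsity. How, or whether, hyperdefinition technology builds on this formulation is not specified in the announcement.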

AIDAR and AISIR are already being trialed and integrated across multiple sectors, and they are designed to work with a broad range of hardware. That hardware-agnostic design is poised to matter even more as industries shift toward richer, more detailed sensing. Analysts expect the automotive sensor market to surge in the coming years, with radar imaging, next-gen ADAS and high-precision machine perception playing increasingly central roles.

Atomathic’s technology comes from a tight-knit team with deep roots in mathematics, machine intelligence and AI research, drawing talent from institutions such as Caltech, UCLA, Stanford and the Technical University of Munich. After seven years of development, the company is ready to show its progress publicly, starting with demonstrations at CES 2026 in Las Vegas.

If the future of autonomy depends on machines perceiving the world with far greater fidelity, Atomathic is betting that the next leap forward won’t come from more hardware but from rethinking the math behind the signal, and from redefining what physical AI can do.