A look at how motivation, not metrics, is becoming the real frontier in fitness tech
Updated
February 7, 2026 2:18 PM

A group of people running together. PHOTO: FREEPIK
Most running apps focus on measurement. Distance, pace, heart rate, badges. They record activity well, but struggle to help users maintain consistency over time. As a result, many people track diligently at first, then gradually disengage.
That drop-off has pushed developers to rethink what fitness technology is actually for. Instead of just documenting activity, some platforms are now trying to influence behaviour itself. Paceful, an AI-powered running platform developed by SportsTech startup xCREW, is part of that shift — not by adding more metrics, but by focusing on how people stay consistent. The platform is built on a simple behavioural insight: most people don’t stop exercising because they don’t care about health. They stop because routines are fragile. Miss a few days and the habit collapses. Technology that focuses only on performance metrics doesn’t solve that. Systems that reinforce consistency, belonging and feedback loops might.
Instead of treating running as a solo, data-driven task, Paceful is built around two ideas: behavioural incentives and social alignment. The system turns real-world running activity into tangible rewards and it uses AI to connect runners to people, clubs and challenges that fit how and where they actually run.
At the technical level, Paceful connects with existing fitness ecosystems. Users can import workout data from platforms like Apple Health and Strava rather than starting from scratch. Once inside the system, AI models analyse pace, frequency, location and participation patterns. That data is used to recommend running partners, clubs and group challenges that match each runner’s habits and context.
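The matching logic described above can be illustrated with a small sketch. This is not Paceful's actual algorithm (the company has not published it); it is a minimal, hypothetical scoring function assuming three signals the article mentions: pace, frequency and location. All names here are invented for illustration.

```python
from dataclasses import dataclass


@dataclass
class RunnerProfile:
    avg_pace_min_per_km: float   # typical running pace
    runs_per_week: float         # how often the runner goes out
    home_area: str               # coarse location label


def compatibility(a: RunnerProfile, b: RunnerProfile) -> float:
    """Score how well two runners match: closer pace and frequency
    raise the score; different areas zero it out entirely."""
    if a.home_area != b.home_area:
        return 0.0
    pace_gap = abs(a.avg_pace_min_per_km - b.avg_pace_min_per_km)
    freq_gap = abs(a.runs_per_week - b.runs_per_week)
    # Simple inverse-distance score in (0, 1].
    return 1.0 / (1.0 + pace_gap + 0.5 * freq_gap)


def best_partner(me: RunnerProfile, others: list[RunnerProfile]) -> RunnerProfile:
    """Recommend the candidate with the highest compatibility score."""
    return max(others, key=lambda o: compatibility(me, o))
```

A production system would presumably learn these weights from participation data rather than hard-coding them, but the shape of the problem (score candidates on habit similarity, filter on geography) is the same.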
What makes this approach different is not the tracking itself, but what the platform does with the data it collects. Running distance and consistency become inputs for a reward system that offers physical-world incentives, such as gear, race entries or gift cards. The idea is to link effort to something concrete, rather than abstract. The company also built the system around community logic rather than individual competition. Even solo runners are placed into challenge formats designed to simulate the motivation of a group. In practice, that means users feel part of a shared structure even when running alone.
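A reward system that values consistency over raw performance might look something like the following toy function. The formula and numbers are invented for illustration, not taken from Paceful; the point is the design choice the article describes, where a streak bonus makes showing up repeatedly worth more than any single long run.

```python
def run_points(distance_km: float, streak_days: int) -> int:
    """Convert one run into reward points.

    Base points scale with distance; a consistency bonus grows with
    the current daily streak, capped so very long streaks don't
    dominate and newcomers still see meaningful rewards.
    """
    base = int(distance_km * 10)       # 10 points per kilometre
    bonus = min(streak_days, 14) * 5   # streak bonus, capped at 14 days
    return base + bonus
```

Under this scheme a 5 km run is worth 50 points on its own, but the same run on day seven of a streak earns 85, which is the behavioural lever: the marginal value of not breaking the routine.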
During a six-month beta phase in the US, xCREW tested Paceful with more than 4,000 running clubs and around 50,000 runners. According to the company, users increased their running frequency significantly and weekly retention remained unusually high for a fitness platform. One beta tester summed it up this way: “Strava just logs records, but Paceful rewards you for every run, which is a completely different motivation”.
The company has raised seed funding and plans to expand the platform beyond running to walking, trekking, cycling and swimming. Instead of asking how accurately technology can measure the body, platforms like Paceful are asking a different question: how technology might influence everyday behaviour. Not by adding more data, but by shaping the conditions around effort, feedback and social connection.
As AI becomes more common in consumer products, its real impact may depend less on how advanced the models are and more on what they are applied to. In this case, the focus isn't speed or performance but consistency — and whether systems like this can meaningfully support it over time.
A step forward that could influence how smart contracts are designed and verified.
Updated
January 8, 2026 6:32 PM

ChainGPT's robot mascot. IMAGE: CHAINGPT
A new collaboration between ChainGPT, an AI company specialising in blockchain development tools, and Secret Network, a privacy-focused blockchain platform, is redefining how developers can safely build smart contracts with artificial intelligence. Together, they've achieved a major industry first: an AI model trained exclusively to write and audit Solidity code is now running inside a Trusted Execution Environment (TEE). For the blockchain ecosystem, this marks a turning point in how AI, privacy and on-chain development can work together.
For years, smart-contract developers have faced a trade-off. AI assistants could speed up coding and security reviews, but only if developers uploaded their most sensitive source code to external servers. That meant exposing intellectual property, confidential logic and even potential vulnerabilities. In an industry where trust is everything, this risk held many teams back from using AI at all.
ChainGPT’s Solidity-LLM aims to solve that problem. It is a specialised large language model trained on over 650,000 curated Solidity contracts, giving it a deep understanding of how real smart contracts are structured, optimised and secured. And now, by running inside SecretVM, the Confidential Virtual Machine that powers Secret Network’s encrypted compute layer, the model can assist developers without ever revealing their code to outside parties.
“Confidential computing is no longer an abstract concept,” said Luke Bowman, COO of the Secret Network Foundation. “We've shown that you can run a complex AI model, purpose-built for Solidity, inside a fully encrypted environment and that every inference can be verified on-chain. This is a real milestone for both privacy and decentralised infrastructure”.
SecretVM makes this workflow possible by using hardware-backed encryption to protect all data while computations take place. Developers don’t interact with the underlying hardware or cryptography. Instead, they simply work inside a private, sealed environment where their code stays invisible to everyone except them—even node operators. For the first time, developers can generate, test and analyse smart contracts with AI while keeping every detail confidential.
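The trust shape described here — a result that anyone can verify came from a specific, unmodified input, produced inside an environment the user never inspects — can be sketched with a toy model. To be clear, this is not SecretVM's protocol: real TEEs use hardware-bound attestation keys and remote attestation, not a shared secret, and the "audit" below is a trivial stand-in for the AI model. Every name here is invented for illustration.

```python
import hashlib
import hmac

# Toy attestation key. In a real TEE this key material is bound to
# the hardware and never leaves the enclave; sharing it like this is
# purely to keep the example self-contained.
ENCLAVE_KEY = b"toy-enclave-key"


def enclave_audit(source: bytes) -> tuple[str, str]:
    """Runs 'inside the enclave': analyse the code and return a
    finding plus a MAC binding that finding to the exact input."""
    # Stand-in for the AI audit: flag one well-known risky pattern.
    finding = "uses tx.origin" if b"tx.origin" in source else "no findings"
    tag = hmac.new(
        ENCLAVE_KEY,
        hashlib.sha256(source).digest() + finding.encode(),
        hashlib.sha256,
    ).hexdigest()
    return finding, tag


def verify(source: bytes, finding: str, tag: str) -> bool:
    """Check that this finding really was produced for this exact
    source, unmodified — the on-chain verifiability the article
    describes, in miniature."""
    expected = hmac.new(
        ENCLAVE_KEY,
        hashlib.sha256(source).digest() + finding.encode(),
        hashlib.sha256,
    ).hexdigest()
    return hmac.compare_digest(expected, tag)
```

The useful property is that the developer's source never has to be revealed to the verifier: checking the tag only requires the hash of the code, so a third party can confirm which audited artifact a finding refers to without seeing the code itself.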
This shift opens new possibilities for the broader blockchain community. Developers gain a private coding partner that can streamline contract logic or catch vulnerabilities without risking leaks. Auditors can rely on AI-assisted analysis while keeping sensitive audit material protected. Enterprises working in finance, healthcare or governance finally have a path to adopt AI-driven blockchain automation without raising compliance concerns. Even decentralised organisations can run smart-contract agents that make decisions privately, without exposing internal logic on a public chain.
The system also supports secure model training and fine-tuning on encrypted datasets. This enables collaborative AI development without forcing anyone to share raw data—a meaningful step toward decentralised and privacy-preserving AI at scale.
By combining specialised AI with confidential computing, ChainGPT and Secret Network are shifting the trust model of on-chain development. Instead of relying on centralised cloud AI services, developers now have a verifiable, encrypted environment where they keep full control of their code, their data and their workflow. It’s a practical solution to one of blockchain’s biggest challenges: using powerful AI tools without sacrificing privacy.
As the technology evolves, the roadmap includes confidential model fine-tuning, multi-agent AI systems and cross-chain use cases. But the core advancement is already clear: developers now have a way to use AI for smart contract development that is fast, private and verifiable—without compromising the security standards that decentralised systems rely on.