Artificial Intelligence

How Analog Devices Is Turning Hardware Into Intelligence

The upgraded CodeFusion Studio 2.0 simplifies how developers design, test and deploy AI on embedded systems.

Updated

January 8, 2026 6:34 PM

Illustration of CodeFusion Studio™ 2.0 showing AI, code and chip icons. PHOTO: ANALOG DEVICES, INC.

Analog Devices (ADI), a global semiconductor company, launched CodeFusion Studio™ 2.0 on November 3, 2025. The new version of its open-source development platform is designed to make it easier and faster for developers to build AI-powered embedded systems that run on ADI’s processors and microcontrollers.

“The next era of embedded intelligence requires removing friction from AI development”, said Rob Oshana, Senior Vice President of the Software and Digital Platforms group at ADI. “CodeFusion Studio 2.0 transforms the developer experience by unifying fragmented AI workflows into a seamless process, empowering developers to leverage the full potential of ADI's cutting-edge products with ease so they can focus on innovating and accelerating time to market”.

The upgraded platform introduces new tools for hardware abstraction, AI integration and automation. These help developers move more easily from early design to deployment.

CodeFusion Studio 2.0 enables complete AI workflows, allowing teams to use their own models and deploy them on everything from low-power edge devices to advanced digital signal processors (DSPs).

Built on Microsoft Visual Studio Code, the new CodeFusion Studio offers built-in checks for model compatibility, along with performance testing and optimization tools that help reduce development time. Building on these capabilities, a new modular framework based on Zephyr OS lets developers test and monitor how AI and machine learning models perform in real time. This gives clearer insight into how each part of a model behaves during operation and helps fine-tune performance across different hardware setups.
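ADI's announcement does not publish the new framework's APIs, but the kind of real-time monitoring it describes can be pictured with a minimal Zephyr RTOS sketch that times each inference pass using the kernel's cycle counter. This is illustrative only: run_model_inference() below is a hypothetical stand-in for whatever entry point a deployed model exposes, and is not part of CodeFusion Studio or any ADI SDK.

```c
/*
 * Minimal sketch: measuring per-inference latency on Zephyr RTOS.
 * run_model_inference() is a hypothetical placeholder, not an ADI API.
 */
#include <zephyr/kernel.h>
#include <zephyr/sys/printk.h>

/* Hypothetical placeholder for a single inference pass over one input frame. */
static void run_model_inference(void)
{
	k_busy_wait(2000); /* stand-in workload: roughly 2 ms of CPU time */
}

int main(void)
{
	while (1) {
		uint32_t start = k_cycle_get_32();          /* hardware cycle counter */
		run_model_inference();
		uint32_t cycles = k_cycle_get_32() - start; /* wrap-safe elapsed cycles */

		/* Convert elapsed cycles to microseconds for a readable log line. */
		uint64_t us = k_cyc_to_us_floor64(cycles);
		printk("inference latency: %u us\n", (uint32_t)us);

		k_msleep(100); /* pace the loop at roughly ten inferences per second */
	}
	return 0;
}
```

A profiling loop like this is the simplest form of the per-component insight the article mentions; a real toolchain would break the timing down by model layer or operator rather than treating the whole inference as one block.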

Additionally, the CodeFusion Studio System Planner has been redesigned to handle more device types and complex, multi-core applications. With new built-in diagnostic and debugging features — like integrated memory analysis and visual error tracking — developers can now troubleshoot problems faster and keep their systems running more efficiently.

This launch marks a deeper pivot for ADI. Long known for high-precision analog chips and converters, the company is expanding its edge-AI and software capabilities to enable what it calls Physical Intelligence — systems that can perceive, reason, and act locally.  

“Companies that deliver physically aware AI solutions are poised to transform industries and create new, industry-leading opportunities. That's why we're creating an ecosystem that enables developers to optimize, deploy and evaluate AI models seamlessly on ADI hardware, even without physical access to a board”, said Paul Golding, Vice President of Edge AI and Robotics at ADI. “CodeFusion Studio 2.0 is just one step we're taking to deliver Physical Intelligence to our customers, ultimately enabling them to create systems that perceive, reason and act locally, all within the constraints of real-world physics”.

Keep Reading

Artificial Intelligence

X-Humanoid Introduces Tien Kung 3.0 as Deployment Challenges Persist in Humanoid Robotics

A closer look at the tech, AI, and open ecosystem behind Tien Kung 3.0’s real-world push

Updated

February 18, 2026 8:03 PM

Humanoid robots working in a warehouse. PHOTO: ADOBE STOCK

Humanoid robotics has advanced quickly in recent years. Machines can now walk, balance, and interact with their surroundings in ways that once seemed out of reach. Yet most deployments remain limited. Many robots perform well in controlled settings but struggle in real-world environments. Integration is often complex, hardware interfaces are closed, software tools are fragmented, and scaling across industries remains difficult.

Against this backdrop, X-Humanoid has introduced its latest general-purpose platform, Embodied Tien Kung 3.0. The company positions it not simply as another humanoid robot, but as a system designed to address the practical barriers that have slowed adoption, with a focus on openness and usability.

At the hardware level, Embodied Tien Kung 3.0 is built for mobility, strength, and stability. It is equipped with high-torque integrated joints that deliver the limb strength needed for high-load applications. The company says it is the first full-size humanoid robot to achieve whole-body, high-dynamic motion control integrated with tactile interaction. In practice, this means the robot is designed to maintain balance and execute dynamic movements even in uneven or cluttered environments. It can clear one-meter obstacles, perform consecutive high-dynamic maneuvers, and carry out actions such as kneeling, bending, and turning with coordinated whole-body control.

Precision is also a focus. Through multi-degree-of-freedom limb coordination and calibrated joint linkage, the system is designed to achieve millimeter-level operational accuracy. This level of control is intended to support industrial-grade tasks that require consistent performance and minimal error across changing conditions.

But hardware is only part of the equation. The company pairs the robot with its proprietary Wise KaiWu general-purpose embodied AI platform. This system supports perception, reasoning, and real-time control through what the company describes as a coordinated “brain–cerebellum” architecture. It establishes a continuous perception–decision–execution loop, allowing the robot to operate with greater autonomy and reduced reliance on remote control.

For higher-level cognition, Wise KaiWu incorporates components such as a world model and vision-language models (VLMs) to interpret visual scenes, understand language instructions, and break complex objectives into structured steps. For real-time execution, a vision-language-action (VLA) model and a full autonomous navigation system manage obstacle avoidance and precise motion under variable conditions. The platform also supports multi-agent collaboration, enabling cross-platform compatibility, asynchronous task coordination, and centralized scheduling across multiple robots.
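X-Humanoid has not detailed Wise KaiWu's interfaces, but the perception–decision–execution loop it describes can be sketched schematically. Every type and function below is a hypothetical placeholder, used only to illustrate how a deliberative "brain" step can hand short action commands to a fast "cerebellum"-style executor; none of it is taken from the platform itself.

```c
/*
 * Illustrative sketch of a perception-decision-execution loop.
 * All names here are hypothetical placeholders, not Wise KaiWu APIs.
 */
#include <stdbool.h>
#include <stdio.h>

typedef struct { double obstacle_distance_m; bool goal_visible; } Observation;
typedef enum { ACTION_ADVANCE, ACTION_AVOID, ACTION_HOLD } Action;

/* "Brain": slower, deliberative step that turns an observation into the next action. */
static Action decide(const Observation *obs)
{
	if (!obs->goal_visible)             return ACTION_HOLD;
	if (obs->obstacle_distance_m < 0.5) return ACTION_AVOID;
	return ACTION_ADVANCE;
}

/* "Cerebellum": fast execution step that would drive joint-level motion control. */
static void execute(Action a)
{
	const char *names[] = { "advance", "avoid", "hold" };
	printf("executing: %s\n", names[a]);
}

int main(void)
{
	/* Canned observations stand in for a live perception stream. */
	Observation stream[] = {
		{ .obstacle_distance_m = 2.0, .goal_visible = true  },
		{ .obstacle_distance_m = 0.3, .goal_visible = true  },
		{ .obstacle_distance_m = 1.0, .goal_visible = false },
	};

	for (size_t i = 0; i < sizeof stream / sizeof stream[0]; i++) {
		execute(decide(&stream[i]));   /* perceive -> decide -> execute */
	}
	return 0;
}
```

The point of the split is latency: the decision layer can afford to reason over scene context and language instructions, while the execution layer must react within a control cycle, which is why the article describes them as separate but continuously coupled stages.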

A central part of the platform is openness. The company states that the system is designed to address compatibility and adaptation challenges across both development and deployment layers. On the hardware side, Embodied Tien Kung 3.0 includes multiple expansion interfaces that support different end-effectors and tools, allowing faster adaptation to industrial manufacturing, specialized operations, and commercial service scenarios. On the software side, the Wise KaiWu ecosystem provides documentation, toolchains, and a low-code development environment. It supports widely adopted communication standards, including ROS2, MQTT, and TCP/IP, enabling partners to customize applications without rebuilding core systems.
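The announcement does not document specific topics or message formats, but because MQTT is a standard protocol, a partner-side integration could look roughly like the sketch below, which uses the Eclipse Paho C client to publish a task command. The broker address, topic name, and JSON payload are invented examples under that assumption, not documented Wise KaiWu endpoints.

```c
/*
 * Minimal sketch: publishing a task command over MQTT with the Eclipse Paho C client.
 * Broker address, topic, and payload are hypothetical examples only.
 */
#include <stdio.h>
#include <string.h>
#include "MQTTClient.h"

int main(void)
{
	MQTTClient client;
	MQTTClient_connectOptions opts = MQTTClient_connectOptions_initializer;
	MQTTClient_deliveryToken token;

	/* Hypothetical local broker and client ID. */
	MQTTClient_create(&client, "tcp://localhost:1883", "task-dispatcher",
	                  MQTTCLIENT_PERSISTENCE_NONE, NULL);

	if (MQTTClient_connect(client, &opts) != MQTTCLIENT_SUCCESS) {
		fprintf(stderr, "failed to connect to broker\n");
		return 1;
	}

	/* Hypothetical topic and payload for a pick-and-place instruction. */
	char payload[] = "{\"task\":\"pick\",\"target\":\"bin_A\"}";
	MQTTClient_publish(client, "robots/tk3/commands", (int)strlen(payload),
	                   payload, 1 /* QoS 1 */, 0 /* not retained */, &token);
	MQTTClient_waitForCompletion(client, token, 1000L);

	MQTTClient_disconnect(client, 1000);
	MQTTClient_destroy(&client);
	return 0;
}
```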

The company also highlights its open-source approach. X-Humanoid has open-sourced key components from the Embodied Tien Kung and Wise KaiWu platforms, including the robot body architecture, motion control framework, world model, embodied VLM and cross-ontology VLA models, training toolchains, the RoboMIND dataset, and the ArtVIP simulation asset library. By opening access to these elements, the company aims to reduce development costs, lower technical barriers, and encourage broader participation from researchers, universities, and enterprises.

Embodied Tien Kung 3.0 enters a market where technical progress is visible but large-scale adoption remains uneven. The gap is not only about movement or strength. It is about integration, interoperability, and the ability to operate reliably and autonomously in everyday industrial and commercial settings. If platforms can reduce fragmentation and simplify deployment, humanoid robots may move beyond demonstrations and into sustained commercial use.

In that sense, the significance of Embodied Tien Kung 3.0 lies less in isolated technical claims and more in how its high-dynamic hardware, embodied AI system, open interfaces, and collaborative architecture are structured to work together. Whether that integrated approach can close the deployment gap will shape how quickly humanoid robotics becomes part of real-world operations.