Arm

AI & ML interests

Resources, tools and content from Arm and our partner ecosystem that enable you to deploy your workloads quickly, efficiently and securely.

Arm’s AI development resources ensure you can deploy at pace, achieving the best performance on Arm by default. Our aim is to make your AI development easier by integrating with all major operating systems and AI frameworks, enabling you to deploy AI on Arm, portably and at scale.

Discover below some key resources and content from Arm, including our software libraries and tools, that enable you to optimize for Arm architectures and pass on significant performance uplifts for models running on Arm-based devices, from traditional ML and computer vision workloads to small and large language models.


Arm Kleidi: Unleashing Mass-Market AI Performance on Arm

Arm Kleidi is a targeted software suite that expedites optimizations for any framework, accelerating billions of AI workloads across Arm-based devices everywhere. Application developers achieve top performance by default, with no additional work and no investment in new skills or tool-specific training.

Useful Resources on Arm Kleidi:


Running LLMs on Mobile

Our foundation of pervasiveness, flexible performance and energy efficiency means that Arm CPUs are already the hardware of choice for a variety of AI workloads. Arm-based servers excel at LLM workloads, and on mobile the Arm Kleidi software suite and optimizations to our software libraries, combined with the open-source llama.cpp project, enable generative AI to run efficiently on-device.

Our work includes a virtual assistant demo that first brought Meta’s Llama 2 7B LLM to mobile via a chat-based application, and has since expanded to include the Llama 3 model and Phi-3 3.8B. You can learn more about the technical implementation of the demos here.
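
As an illustrative sketch only (not official Arm guidance), the commands below show how one might build llama.cpp with its KleidiAI-backed CPU path and assemble a run command for a quantized model on an Arm device. The CMake flag, binary path, model filename and thread count are assumptions that vary by llama.cpp version and device; check your checkout's build documentation.

```shell
# Illustrative sketch: build llama.cpp with KleidiAI CPU kernels enabled
# and run a 4-bit quantized model. All flags and paths are assumptions.

# 1. Fetch and build (commented out: needs network access and a toolchain).
# git clone https://github.com/ggml-org/llama.cpp
# cd llama.cpp
# cmake -B build -DGGML_CPU_KLEIDIAI=ON     # pick up Kleidi micro-kernels
# cmake --build build --config Release

# 2. Assemble the on-device run command for a quantized GGUF model.
MODEL="phi-3-mini-4k-instruct-q4.gguf"  # hypothetical local model file
THREADS=4                               # e.g. match the big-core count
CMD="./build/bin/llama-cli -m $MODEL -t $THREADS -p 'Hello'"
echo "$CMD"
```

The point of this flow, per the text above, is that no application-level changes are needed: the optimized kernels are picked up through the framework build itself, so developers get the acceleration by default.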

Find out more about the community contributions that make this happen:

These advancements are also highlighted in our Learning Paths below.


AI on Arm in the Cloud

Arm Neoverse platforms give our infrastructure partners leading performance, efficiency and unparalleled flexibility to innovate in pursuit of optimal solutions for emerging AI workloads. That flexibility enables our hardware partners to closely integrate additional compute acceleration into their designs, creating a new generation of built-for-AI custom data center silicon.

Read the latest on AI-on-Neoverse:


Arm Learning Paths

Tutorials designed to help you develop quality Arm software faster.

Contribute to our Learning Paths: suggest a new Learning Path or create one yourself with support from the Arm community.


Note: The data collated here is sourced from Arm and third parties. While Arm uses reasonable efforts to keep this information accurate, Arm does not warrant (express or implied) or provide any guarantee of data correctness due to the ever-evolving AI and software landscape. Any links to third party sites and resources are provided for ease and convenience. Your use of such third-party sites and resources is subject to the third party’s terms of use, and use is at your own risk.
