The Raspberry Pi AI HAT+ 2 is the newest PCIe add-on for the Raspberry Pi 5, and the headline change is simple: it places 8 GB of onboard LPDDR4X RAM next to a Hailo-10H accelerator, so more of a model’s working set can live close to the NPU instead of being squeezed through the comparatively narrow link to system memory.
 

Top-down view of the Raspberry Pi AI HAT+ 2 mounted on a Raspberry Pi 5, with the host board blurred against a blue tech-pattern background.
Raspberry Pi AI HAT+ 2 shown fitted to a Raspberry Pi 5 via the PCIe connection, pairing a Hailo accelerator with onboard memory.

What’s New

The previous Raspberry Pi-branded Hailo add-ons for Pi 5 were mainly framed as vision accelerators: object detection, segmentation, pose estimation, and other camera-first pipelines. The AI HAT+ 2 keeps that “bolt-on NPU” idea, but the added 8 GB of onboard RAM is the real story, because it shifts the feasible workload mix toward models that are memory-hungry even when compute is available.

Raspberry Pi AI HAT+ 2 Specs

A PCIe AI/ML accelerator add-on for Raspberry Pi 5, designed to accelerate vision models plus selected LLM and VLM workloads.
 

Key Specifications

  • Hailo-10H neural network accelerator
  • 8 GB onboard LPDDR4X-4267 SDRAM
  • PCI Express attachment; Raspberry Pi 5 only
  • Vision performance positioned as similar to the 26 TOPS AI HAT+
  • Accelerates selected LLMs and VLMs

Expected LLM/VLM support at launch

  • Llama-3.2-3B-Instruct
  • Qwen2.5-VL-3B

Physical

  • Dimensions: 56.7 mm (W) × 65.1 mm (L) × 5.5 mm (H)
  • Weight: 19 g (board) / 48 g (carton with accessories)
 

While the specifications look promising on paper, a few details remain to be confirmed. The memory is listed as “LPDDR4X-4267 SDRAM,” but Hailo’s public documentation for the 10H typically states only “4 | 8 GB LPDDR4/4X” without specifying the speed grade. If the 4267 MT/s data rate is accurate, that’s a solid choice for memory-bound GenAI workloads — but until we see official confirmation, consider it provisional.
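If the 4267 MT/s figure holds, a quick back-of-envelope calculation shows why it matters: LLM token generation is typically bound by memory bandwidth, since the weights are re-read for every generated token. The sketch below assumes a 32-bit DRAM interface and a roughly 1.5 GB model image (about 3B parameters at 4-bit); both are illustrative assumptions, not confirmed specs.

```python
# Rough upper bound on LLM decode speed when weights are streamed from the
# accelerator's DRAM once per generated token (the memory-bound regime).
# The bus width and model size below are assumptions for illustration;
# the 4267 MT/s speed grade is itself not yet officially confirmed.

bus_bits = 32            # assumed LPDDR4X interface width (unconfirmed)
data_rate_mt_s = 4267    # LPDDR4X-4267 listed speed grade

# MT/s x bus width (bits) -> Mb/s; /8 -> MB/s; /1000 -> GB/s
bw_gb_s = data_rate_mt_s * bus_bits / 8 / 1000

model_gb = 1.5           # ~3B parameters at 4-bit, weights only (assumption)
tokens_per_s = bw_gb_s / model_gb

print(f"peak DRAM bandwidth ≈ {bw_gb_s:.1f} GB/s")
print(f"memory-bound decode ceiling ≈ {tokens_per_s:.0f} tokens/s")
```

Under these assumptions, the ceiling lands in the low double digits of tokens per second, which is why a dedicated memory pool next to the NPU matters more here than raw TOPS.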
 

A spec summary graphic for Raspberry Pi AI HAT+ 2 showing the board image and bullet points for key features, supported models, and physical dimensions.
The Raspberry Pi AI HAT+ 2 combines a Hailo-10H accelerator with 8 GB LPDDR4X memory for Raspberry Pi 5, targeting vision plus selected LLM/VLM workloads.


Similarly, the claim that vision performance will be “similar to the 26 TOPS AI HAT+” is difficult to verify. The Hailo-8 option in the earlier HAT+ is vision-optimized, while the 10H trades some of that specialization for GenAI versatility. The two chips have different architectures, so direct TOPS-to-TOPS comparisons don’t tell the whole story.

There’s also the matter of PCIe bandwidth. The Raspberry Pi 5 exposes a single-lane PCIe 2.0 interface, which tops out at roughly 500 MB/s*. The Hailo-10H module, by contrast, is designed for PCIe Gen 3.0 ×4, which can theoretically deliver about 4 GB/s: an eight-fold mismatch. For lightweight inference, or for models that fit entirely in the 10H’s onboard RAM, the bottleneck may go unnoticed; for workloads that shuttle data frequently between the Pi and the accelerator, performance could take a hit. The good news is that the onboard memory exists precisely to minimize that traffic, so the real-world impact will depend heavily on the models and use cases you’re running.
* With config overrides, the external connector can be set to Gen 3 ×1 (~1 GB/s), but this mode is not part of Raspberry Pi’s official certification.
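The mismatch above can be put in numbers. The sketch below compares the theoretical payload bandwidth of the relevant link configurations, after accounting for PCIe line coding, and the time each would need to stream a hypothetical 1.5 GB model image; all figures are theoretical maxima, not measurements.

```python
# Back-of-envelope PCIe link budgets (theoretical payload rates after line
# coding, ignoring further protocol overhead). The 1.5 GB model image is an
# assumption for illustration, not a measured figure.

def link_bw_mb_s(gt_per_s, lanes, coding_efficiency):
    """Payload bandwidth in MB/s for a PCIe link."""
    # GT/s x lanes x efficiency -> Gb/s payload; /8 -> GB/s; x1000 -> MB/s
    return gt_per_s * lanes * coding_efficiency / 8 * 1000

gen2_x1 = link_bw_mb_s(5.0, 1, 8 / 10)     # PCIe 2.0: 8b/10b line coding
gen3_x1 = link_bw_mb_s(8.0, 1, 128 / 130)  # PCIe 3.0: 128b/130b line coding
gen3_x4 = link_bw_mb_s(8.0, 4, 128 / 130)

model_mb = 1500  # ~3B parameters at 4-bit, weights only (assumption)

for name, bw in [("Gen 2 x1", gen2_x1), ("Gen 3 x1", gen3_x1), ("Gen 3 x4", gen3_x4)]:
    print(f"{name}: {bw:,.0f} MB/s -> {model_mb / bw:.1f} s to stream {model_mb} MB")
```

For reference, the override mentioned in the footnote is typically the `dtparam=pciex1_gen=3` line in `/boot/firmware/config.txt`, which roughly doubles the external link’s throughput, though running that connector at Gen 3 remains outside Raspberry Pi’s official certification.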

The design lines up neatly with how Hailo positions the Hailo-10H. The company markets it as a generative-AI-capable edge accelerator with a direct DDR interface, so it can scale to larger models (LLMs and VLMs) that are often bottlenecked by memory locality rather than raw TOPS alone; the overview is on Hailo’s product page.

How the AI HAT+ 2 Fits the Pi 5 PCIe Ecosystem

If your baseline is the existing AI HAT+ line (13 TOPS and 26 TOPS), that board is still the simplest vision-first route on the Pi 5. Raspberry Pi’s own accessory page for the original AI HAT+ describes it as a Hailo-equipped add-on aimed at running multiple concurrent AI tasks, and for pure camera throughput it remains the obvious “start here” choice.
 

Angled view of the Raspberry Pi AI HAT+ 2 mounted on a Raspberry Pi 5, with the host board blurred on a blue tech-pattern background.
The Raspberry Pi AI HAT+ 2 fits the Raspberry Pi 5 PCIe ecosystem.


Before the AI HAT+ became the default recommendation, Raspberry Pi pushed the AI Kit bundle: their M.2 HAT+ plus a Hailo-8L module. It’s still a tidy way to add Hailo acceleration, and it’s also what many readers already own. For an Elektor refresher on that approach, see our earlier coverage.
 

And, of course, there’s the plumbing option: the M.2 HAT+ itself, which effectively turns the Pi 5’s single-lane PCIe 2.0 interface into an M.2 slot. In practice, it’s often an either/or decision (NVMe SSD or accelerator module: one at a time, please). That context is covered in our M.2 HAT+ launch piece.

AI HAT+ 2 Model Targets: LLMs, VLMs, and Whisper

Where the AI HAT+ 2 tries to justify its existence is in “selected” generative and multimodal workloads. In other words, the goal is not just more bounding boxes per second, but making a small set of LLM- and VLM-class models practical on a Pi-class host by giving the accelerator its own memory pool.

Concretely, the expected/early target list includes small instruct models and compact VLMs such as Llama-3.2-3B-Instruct and Qwen2.5-VL-3B, plus speech workloads such as Whisper (for example, Whisper-Base appears in Hailo’s GenAI model explorer). Partner material has also pointed at distilled reasoning models; as one example, DeepSeek-R1-Distill-Qwen-1.5B is listed for Hailo-10H in that same model-explorer family.

The practical takeaway is that the AI HAT+ 2 is less about replacing the 26 TOPS AI HAT+ for classic vision, and more about expanding what “Pi 5 + accelerator” can credibly run when model size and intermediate activations become the limiter.

Bottom Line

If you already have the 26 TOPS AI HAT+ and your job is camera/vision throughput, the Raspberry Pi AI HAT+ 2 does not read like a must-upgrade on vision alone. The reason to step up is the added onboard memory on the accelerator side, which is specifically what makes selected LLM, VLM, and Whisper-class workloads plausible on Raspberry Pi 5 without immediately falling over on memory pressure.
