How I Fixed 30-Minute PyOCD Updates (One-Line Fix)

A simple upgrade turned a 30-minute PyOCD torture session into a 1-minute inconvenience. Here's the root cause and why your embedded CI might be hitting the same issue.

The Setup: Preparing for a Workshop

I was putting together a devcontainer for an upcoming Zephyr RTOS workshop. The idea was simple - give every participant an identical, Docker-based development environment so nobody wastes time on “it works on my machine” problems.

The workshop needed to cover MCXA156 microcontrollers, which meant adding support for this MCU to the development environment. In Zephyr, you can use PyOCD as the debugger and programmer, so I needed to update its pack index to include device support files for the new hardware.

This should have been routine. Run pyocd pack update, wait a minute or two, done.
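
For context, adding device support is normally just a few commands. A minimal sketch - the mcxa156 search pattern is my assumption; pyocd pack find shows what the index really contains:

pyocd pack update             # refresh the CMSIS-Pack index
pyocd pack find mcxa156       # search the index for the new MCU
pyocd pack install mcxa156    # download the matching device support pack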

When Routine Became Torture

Instead, I got this:

  ~ time pyocd pack update
0000311 I Updating pack index... [pack_cmd]
Downloading descriptors (023/1659)

And then… nothing. The progress counter was stuck at 023/1659. I waited. Made coffee. Came back. Still stuck.

Meanwhile, I started getting messages from people testing the devcontainer setup: “PyOCD updates are taking forever - sometimes 30 minutes!” At first, I thought they were exaggerating or had network issues.

Then my own update finally completed:

pyocd pack update  2.79s user 3.51s system 1% cpu 6:54.96 total

6 minutes and 54 seconds. And that was one of the “fast” runs. Some participants reported waiting 30 minutes or more.

The CI Pain Point

This wasn’t just a local annoyance. Our GitHub Actions CI was burning through minutes, and I’d noticed similar issues 3-6 months earlier on STM32 projects. Back then, we worked around it by manually caching pack files, but that was treating symptoms, not the root cause.
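
Back then, the workaround looked roughly like this - a sketch rather than our exact pipeline. CACHE_DIR is a hypothetical directory your CI persists between runs, and the pack location assumes cmsis-pack-manager's default data directory on Linux:

CACHE_DIR=/ci-cache/cmsis-packs                     # hypothetical persistent CI cache
PACK_DIR="$HOME/.local/share/cmsis-pack-manager"    # assumed default index location on Linux

if [ -d "$CACHE_DIR" ]; then
    # Reuse the previously downloaded index instead of hitting the network
    mkdir -p "$(dirname "$PACK_DIR")"
    rm -rf "$PACK_DIR"
    cp -r "$CACHE_DIR" "$PACK_DIR"
else
    # Only the first run (or an expired cache) pays the full download cost
    pyocd pack update
    mkdir -p "$(dirname "$CACHE_DIR")"
    cp -r "$PACK_DIR" "$CACHE_DIR"
fi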

The pattern was always the same:

  • CI would start normally
  • Hit pyocd pack update
  • Stall for ages on “Downloading descriptors”
  • Eventually time out or complete after burning 20-30 minutes

Debugging the Root Cause

After some digging, I found GitHub issue #1783 describing exactly this problem. The culprit wasn’t PyOCD itself, but cmsis-pack-manager - the underlying package that handles CMSIS-Pack downloads.
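
You can see that relationship on your own machine with standard pip commands (the exact output formatting varies between pip versions):

pip show pyocd                 # the "Requires:" line lists cmsis-pack-manager
pip show cmsis-pack-manager    # the installed version is what matters here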

The technical breakdown:

  • PyOCD uses cmsis-pack-manager for package management
  • Older versions (like my 0.5.3) had no timeout handling for corrupted registry entries
  • When the pack registry had broken or slow entries, the tool would hang indefinitely
  • No progress indication beyond the initial “Downloading descriptors” message

The Fix

The solution came with PR #242 in cmsis-pack-manager 0.6.0, which added download timeouts for PDSC entries.

The upgrade:

pip install --upgrade cmsis-pack-manager

The result:

  ~ time pyocd pack update 
0000711 I Updating pack index... [pack_cmd]
Downloading descriptors (1659/1659)
pyocd pack update  2.61s user 5.22s system 13% cpu 56.886 total

From 6:54 to 56 seconds. Not perfect, but usable.

What This Means for Your Setup

If you’re experiencing:

  • Slow PyOCD pack updates
  • CI timeouts on embedded builds
  • Hanging “Downloading descriptors” messages

Check your cmsis-pack-manager version:

pip show cmsis-pack-manager

If it’s below 0.6.0, upgrade:

pip install --upgrade cmsis-pack-manager
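
If your devcontainer or CI image is rebuilt from a requirements file, it's also worth pinning a floor so the old version can't sneak back in - a minimal sketch:

pip install --upgrade "cmsis-pack-manager>=0.6.0"

(The quotes keep the shell from treating >= as an output redirect.)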

The Changelog Evidence

From cmsis-pack-manager 0.6.0 release notes:

  • “Add download timeouts for PDSC entries by @danieldegrasse in #242”

Sometimes the most annoying problems have the simplest solutions. A single pip install --upgrade turned 30-minute torture sessions into manageable 1-minute waits.

Lessons Learned

  1. Check dependencies, not just main tools - PyOCD wasn’t the problem
  2. Timeouts matter - Network operations without timeouts are time bombs
  3. Community fixes exist - Someone probably already solved your problem
  4. CI minutes are expensive - Both in cost and developer frustration
