I’m new to PX4 and I’ve spent the last week just trying to get PX4 SITL to build and run on macOS, and I’m hoping for a lifeline because I’m about to give up. I’ve gone through almost every combination I can find in the docs, on YouTube, and in old forum threads. Nothing builds cleanly, and the similar forum posts I can find are pre-2020 and/or unanswered.
My setup
Intel iMac and M3 MacBook Pro, both running macOS Tahoe 26.1
Tried both native ARM and Rosetta (x86_64) shells
Also tried using a Pixi environment instead of conda, since I’ve had better luck with Gazebo and ROS 2 using Pixi
Tested with Gazebo Harmonic, Gazebo Fortress, and jMAVSim
PX4 versions: latest main, and 1.16
Depending on which environment I build in (native vs Rosetta, brew vs pixi vs conda), I end up hitting one or more errors below:
1. Protobuf / Abseil
fatal error: "Protobuf C++ gencode is built with an incompatible version of …"
absl/base/policy_checks.h: C++ versions less than C++17 are not supported
2. Gazebo plugin linkage
CMake Error at src/modules/simulation/gz_plugins/optical_flow/CMakeLists.txt:47 (target_link_libraries):
Target "OpticalFlowSystem" links to:
gz-sim::gz-sim
but the target was not found.
...
What I’ve already tried
Clean rebuilds (rm -rf build/px4_sitl_default)
Running Tools/setup/macos.sh
Reinstalling protobuf, ffmpeg, abseil under both /opt/homebrew (ARM) and /usr/local (x86)
Installing kconfiglib, pyros-genmsg, jsonschema etc. under the same interpreter PX4 is using
Using Pixi + RoboStack (ROS2 Humble + Gazebo)
Switching between ARM and x86_64 brew prefixes
At this point I’ve spent days swapping shells, compilers, and package sources, and I still can’t get a successful build.
My question
Is PX4 SITL actually known to build successfully on macOS right now (Intel and/or M-series)?
If so, could someone please share the exact environment and steps that worked, and which combo (chip, simulator, PX4 version) they used?
If not, it is what it is, but I’d like to confirm whether it’s just me or if macOS support is effectively broken.
Gotcha gotcha, it would be wonderful to get better dev team support for Mac. A couple of things I had to work around:
macos.sh relies on out-of-date Python path management, so I created a venv and manually pip-installed requirements.txt
The toolchain instructions refer to adoptopenjdk15, but the bundled jMAVSim ships class files with major version 69, which requires Java 25, so a JDK that supports Java 25 needs to be installed instead
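In case it helps anyone following along, the venv workaround above boils down to something like this (the `PX4_SRC` variable and defaulting to the current directory are my assumptions; point it at your actual PX4-Autopilot checkout):

```shell
# Sketch of the venv workaround: create an isolated interpreter instead of
# relying on macos.sh's Python path handling.
PX4_SRC="${PX4_SRC:-$PWD}"

# Create and activate a virtual environment inside the checkout
python3 -m venv "$PX4_SRC/.venv"
. "$PX4_SRC/.venv/bin/activate"

# requirements.txt lives under Tools/setup/ in the PX4 tree; guard the
# install so this snippet doesn't fail outside a checkout
if [ -f "$PX4_SRC/Tools/setup/requirements.txt" ]; then
    python -m pip install -r "$PX4_SRC/Tools/setup/requirements.txt"
fi
```

Any later `make px4_sitl` run then needs to happen in a shell where this venv is still activated, so the build picks up the same interpreter.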
Circling back to the original issue in this thread: I believe it comes down to protobuf now taking a hard dependency on abseil, whose most recent bottle asserts C++17 or newer. Scanning the Gazebo modules, several still build with the C++14 standard, which trips the version check in abseil. I tried hacking around the brew versioning using a local build of abseil, but didn’t have any luck. I think the path forward is updating any Gazebo sim code still on the deprecated C++14 standard to C++17.
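The kind of change I mean is bumping the C++ standard on the affected plugin targets in their CMakeLists. A minimal sketch, assuming `OpticalFlowSystem` (the target named in the CMake error earlier in the thread) is one of the offenders:

```cmake
# Sketch: require C++17 on the plugin target so abseil's
# policy_checks.h assertion passes (target name assumed from the
# CMake error above; apply the same to any other C++14 gz plugin)
set_property(TARGET OpticalFlowSystem PROPERTY CXX_STANDARD 17)
set_property(TARGET OpticalFlowSystem PROPERTY CXX_STANDARD_REQUIRED ON)
```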
Apologies for the late reply. I had to make do with jMAVSim in the short term, but I’m looking to circle back to using Gazebo.
Bumping the CMake C++ standard only gets me so far. I run into several issues building OpenCV/OpticalFlow.
First I hit a conversion warning promoted to an error:
In file included from /usr/local/Cellar/abseil/20250814.1/include/absl/container/internal/layout.h:198:
/usr/local/Cellar/abseil/20250814.1/include/absl/strings/str_cat.h:352:61: fatal error: implicit conversion increases floating-point precision: 'float' to 'double' [-Wdouble-promotion]
352 | : piece_(digits_, numbers_internal::SixDigitsToBuffer(f, digits_)) {}
| ~~~~~~~~~~~~~~~~ ^
1 error generated.
ninja: build stopped: subcommand failed.
make: *** [px4_sitl] Error 1
I sidestepped the warning by adding `-Wno-double-promotion` to the top-level `px4_add_common_flags.cmake` file.
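A narrower alternative than disabling the warning globally might be scoping it to the one target that pulls in abseil’s `str_cat.h` — a sketch, with the target name assumed from the earlier CMake error rather than confirmed:

```cmake
# Sketch: silence -Wdouble-promotion only for the plugin that includes
# abseil headers, leaving the flag active for the rest of the build
target_compile_options(OpticalFlowSystem PRIVATE -Wno-double-promotion)
```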
But then I encounter:
4.12.0.dylib /usr/local/lib/libopencv_imgproc.4.12.0.dylib /usr/local/lib/libopencv_core.4.12.0.dylib && :
clang++: error: no such file or directory: '/usr/local/lib/libOpticalFlow.so'
[500/505] Building CXX object src/modules/simulation/gz_plugins/buoyancy/CMakeFiles/BuoyancySystemPlugin.dir/BuoyancySystem.cpp.o
ninja: build stopped: subcommand failed.
So then I changed lines 50 and 57 of optical_flow.cmake to expect `.dylib` (the macOS convention) instead of `.so`. That gets me further, but then:
INFO [init] Waiting for Gazebo world...
INFO [init] Waiting for Gazebo world...
INFO [init] Waiting for Gazebo world...
INFO [init] Waiting for Gazebo world...
INFO [init] Waiting for Gazebo world...
ERROR [init] Timed out waiting for Gazebo world
ERROR [px4] Startup script returned with return value: 256
FAILED: [code=255] src/modules/simulation/gz_bridge/CMakeFiles/gz_x500 /Users/johannpally/PX4-Autopilot/build/px4_sitl_default/src/modules/simulation/gz_bridge/CMakeFiles/gz_x500
cd /Users/johannpally/PX4-Autopilot/build/px4_sitl_default/src/modules/simulation/gz_bridge && /usr/local/bin/cmake -E env PX4_SIM_MODEL=gz_x500 GZ_IP=127.0.0.1 /Users/johannpally/PX4-Autopilot/build/px4_sitl_default/bin/px4
ninja: build stopped: subcommand failed.
make: *** [px4_sitl] Error 255
It finally starts a PX4 shell, but the Gazebo instance took longer than the timeout, so PX4 exited. Rerunning the sim start command finally attached the PX4 instance to the already-running Gazebo instance, and we’re smooth sailing now.
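On the `.dylib` edit above: rather than hard-coding the extension, CMake already knows the platform’s shared-library suffix, so a more portable sketch (the variable name and path here are my assumptions, not the actual contents of optical_flow.cmake) would be:

```cmake
# Sketch: let CMake pick .so on Linux and .dylib on macOS instead of
# hard-coding the extension (path and variable name are assumptions)
set(OPTICAL_FLOW_LIB
    "${CMAKE_BINARY_DIR}/lib/libOpticalFlow${CMAKE_SHARED_LIBRARY_SUFFIX}")
```

Something along these lines upstream would spare macOS users from patching the file by hand.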
Is the development process easier on other OSes, or on Arm? I’m on an Intel Mac and I feel like I’m dusting off a lot of cobwebs here.