Set LinuxRT toolchain to 2023Q1 #449
Conversation
Related to issue #439
Could you also update the README section discussing LabVIEW compatibility while you are here?
I also found that the cross-compile include directories were incorrect, so I updated them to align with the 2023Q1 toolchain. Note that this directory was also incorrect for the 2023Q4 toolchain (it should have been the 11.3.0 directory), but I believe the compiler worked around it. I've updated the cross-compile cmake file to use the proper include path and also added a comment in the GitHub action to update the include directories to match whenever the toolchain is updated. I've tested rebuilding with the properly defined include paths and verified functionality on both NILRT 24Q4 and 22Q4. I see no difference in the build output from the previous builds with the incorrect include directory.
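For anyone auditing a toolchain bump like this in the future, one way to check that the include directories hard-coded in the cross-compile cmake file match what the toolchain actually ships is to ask the compiler for its built-in search list. This is only a sketch: `CROSS_CXX` is a placeholder for the sysroot'd g++ from the toolchain, and the paths shown are illustrative, not the real toolchain layout.

```python
import os
import subprocess


def parse_search_list(stderr_text: str) -> list[str]:
    """Extract the '#include <...>' search dirs from gcc's -Wp,-v stderr output."""
    dirs, collect = [], False
    for line in stderr_text.splitlines():
        if line.startswith("#include <...> search starts here:"):
            collect = True
        elif line.startswith("End of search list."):
            break
        elif collect:
            dirs.append(line.strip())
    return dirs


def builtin_include_dirs(cxx: str) -> list[str]:
    # `-E -Wp,-v` makes the preprocessor print its include search list on stderr
    # without compiling anything.
    proc = subprocess.run(
        [cxx, "-E", "-Wp,-v", "-xc++", os.devnull],
        capture_output=True, text=True,
    )
    return parse_search_list(proc.stderr)


if __name__ == "__main__":
    # Point CROSS_CXX at the toolchain's g++; falls back to host g++ for illustration.
    for d in builtin_include_dirs(os.environ.get("CROSS_CXX", "g++")):
        print(d)
```

Comparing this output against the `include_directories` entries in the cross-compile cmake file would have caught the wrong 11.3.0/10.3.0 directory immediately.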
@kt-jplotzke it seems something in this change is causing the completion of the Linux RT cross compile to not be reported correctly, so the final status check never completes.
I think I figured it out. There was a branch protection rule that also had to be updated since you renamed the job created by the github action. I'm rerunning things now. Hopefully this will resolve it. |
@jasonmreding Hmm, thanks. Odd, since I don't think anything should have been renamed. I also can't repro it using a runner in my forked repo. Also odd is that my other PR failed the same way. I obviously can't see the self-hosted runner which performs that final action. I see some GitHub issues from others about self-hosted runners sitting in a queued state forever and needing a reboot to kick them back off. If this build fails, can you check on that runner and perhaps try a reboot?
This rename is what I believe was causing things to get hung up. The branch protection rules for master (which you do not have access to) list that action by the old name. I had to remove the old entry and add a new entry with the new name. Going forward, there should no longer be a reason to rename the action when upgrading gcc versions, so hopefully this will not happen again. Yes, the self-hosted runner required to perform the final action appears to be offline. I don't have physical access to that machine, so there is nothing I can do to resolve the issue. I've notified the people who do have access, but it might take until tomorrow morning U.S. time before it gets resolved. Sorry.
In order to increase the compatibility of the grpc-labview library on LinuxRT, I am proposing moving from building with the 2023Q4 toolchain (introduced in PR #432) to the 2023Q1 toolchain. The 2023Q4 toolchain limits compatibility of the LinuxRT shared library to NILRT 2023Q4+. Building with the 2023Q1 toolchain extends compatibility back to NILRT 2022Q4+.
The toolchain was likely updated because the grpc v1.62 package only builds under gcc 7.3+ (see the build requirements of gRPC 1.62 here). The former NILRT 2018 toolchain only supported gcc 6 and thus needed to be updated. The 2023Q1 toolchain supports gcc 10.3, while the 2023Q4 toolchain supports gcc 11.3.
Importantly, the 2023Q1 toolchain contains libc 2.33 and libstdc++ 10.3, which are also compatible with NILRT 2022Q4.
The 2023Q4 toolchain contains libc 2.35 and libstdc++ 11.3, which are only available on NILRT 2023Q4+; this is the source of the compatibility issues.
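To make the incompatibility concrete: glibc is backward compatible, so a binary loads as long as the highest `GLIBC_x.y` symbol version it references does not exceed the target's glibc. The version tags a built .so requires can be harvested with e.g. `objdump -T <library>.so | grep GLIBC_` (library name illustrative). A rough sketch of the check, with made-up symbol dumps standing in for real toolchain output:

```python
import re


def max_glibc_required(symbol_dump: str) -> tuple[int, int]:
    """Highest GLIBC_x.y version tag appearing in a dynamic-symbol dump."""
    tags = re.findall(r"GLIBC_(\d+)\.(\d+)", symbol_dump)
    return max((int(major), int(minor)) for major, minor in tags)


def loads_on(symbol_dump: str, target_glibc: tuple[int, int]) -> bool:
    # The library loads if the newest symbol version it needs is available.
    return max_glibc_required(symbol_dump) <= target_glibc


# NILRT 2022Q4 ships glibc 2.33. A 2023Q4-toolchain build that pulls in
# GLIBC_2.35 symbols therefore fails to load there, while a 2023Q1 build
# capped at GLIBC_2.33 loads fine.
```

The same check against libstdc++'s `GLIBCXX_` tags covers the C++ runtime side.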
You can also reference Google's Open Source C++ support matrix; I do not see any incompatibility with the 2023Q1 toolchain there.
Further, since grpc-labview is intended to support LabVIEW 2019+, restricting to NILRT 2023Q4+ rules out LabVIEW RT 2019 development (NILRT 2023Q4 only supports LabVIEW 2020+).
For testing, I built the shared library using the 2023Q1 toolchain and exercised both client and server functions on NILRT 2022Q4; everything looks functional.
Reference successful build here: https://github.com/kt-jplotzke/grpc-labview/actions/runs/15144256743/job/42575931912