Merged
7 changes: 7 additions & 0 deletions .github/actions/build-and-test/action.yml
@@ -18,6 +18,13 @@ runs:
python3-pip \
build-essential \

- name: 🔗 Install rosdep dependencies
shell: bash
run: |
sudo rosdep init || true
rosdep update
rosdep install --from-paths . --ignore-src -r -y

- name: 🧱 Build workspace
shell: bash
run: |
74 changes: 74 additions & 0 deletions deep_core/CMakeLists.txt
@@ -0,0 +1,74 @@
# Copyright (c) 2025-present WATonomous. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

cmake_minimum_required(VERSION 3.22)
project(deep_core)

if(NOT CMAKE_CXX_STANDARD)
set(CMAKE_CXX_STANDARD 17)
endif()

if(CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES "Clang")
add_compile_options(-Wall -Wextra -Wpedantic)
add_link_options(-Wl,-no-undefined)
endif()

find_package(ament_cmake REQUIRED)
find_package(rclcpp REQUIRED)
find_package(rclcpp_lifecycle REQUIRED)
find_package(pluginlib REQUIRED)
find_package(bondcpp REQUIRED)

set(include_dir ${CMAKE_CURRENT_SOURCE_DIR}/include)

# deep_core library
set(DEEP_CORE_LIB ${PROJECT_NAME}_lib)
add_library(${DEEP_CORE_LIB} SHARED
src/deep_node_base.cpp
src/tensor.cpp
src/backend_memory_allocator.cpp
src/backend_inference_executor.cpp
)
target_include_directories(${DEEP_CORE_LIB} PUBLIC
$<BUILD_INTERFACE:${include_dir}>
$<INSTALL_INTERFACE:include>
)
target_link_libraries(${DEEP_CORE_LIB}
PUBLIC
pluginlib::pluginlib
PRIVATE
rclcpp::rclcpp
rclcpp_lifecycle::rclcpp_lifecycle
bondcpp::bondcpp
)

install(TARGETS
${DEEP_CORE_LIB}
EXPORT ${PROJECT_NAME}Targets
ARCHIVE DESTINATION lib
LIBRARY DESTINATION lib
RUNTIME DESTINATION bin
)

install(EXPORT ${PROJECT_NAME}Targets
NAMESPACE ${PROJECT_NAME}::
DESTINATION share/${PROJECT_NAME}/cmake
)

install(DIRECTORY include/
DESTINATION include
)

ament_export_targets(${PROJECT_NAME}Targets HAS_LIBRARY_TARGET)
ament_package()
71 changes: 71 additions & 0 deletions deep_core/DEVELOPING.md
@@ -0,0 +1,71 @@
# deep_core developer guide

## Design Principles

- `Tensor` is a smart pointer, not a traditional tensor class; it points to data in memory allocated by the backend memory allocator
- DeepNodeBase handles plugin loading automatically via parameters
- All backends are plugins - no hard framework dependencies
- Memory allocators enable zero-copy GPU integration

## Usage

### CMakeLists.txt

```cmake
find_package(deep_core REQUIRED)

target_link_libraries(${YOUR_LIBRARY}
deep_core::deep_core_lib
)
```

### Creating an Inference Node

**Inherit from `DeepNodeBase`** to get automatic plugin loading and model management.

Key lifecycle callbacks to override:
- `on_configure_impl()` - Set up subscribers, publishers, services
- `on_activate_impl()` - Start processing (DeepNodeBase handles plugin/model loading)
- `on_deactivate_impl()` - Stop processing
- `on_cleanup_impl()` - Clean up resources

**DeepNodeBase automatically handles:**
- Loading backend plugin based on `Backend.plugin` parameter
- Loading model based on `model_path` parameter
- Bond connections if `Bond.enable` is true
- Calling your `*_impl()` methods after base functionality

**Your node just needs to:**
- Set up ROS interfaces (topics, services, actions)
- Process incoming data using `run_inference(Tensor)`
- Handle your specific business logic

Don't forget: `RCLCPP_COMPONENTS_REGISTER_NODE(your_namespace::YourNode)`
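
The following is a minimal sketch of such a node. The header path, the `DeepNodeBase` constructor signature, and the `CallbackReturn` type of the `*_impl()` callbacks are assumptions for illustration only; consult the actual `deep_core` headers for the real interface.

```cpp
// Sketch only — header path, constructor signature, and callback return type are assumptions.
#include <deep_core/deep_node_base.hpp>  // assumed location of DeepNodeBase
#include <rclcpp_components/register_node_macro.hpp>
#include <sensor_msgs/msg/image.hpp>

namespace my_package
{

class ImageInferenceNode : public deep_ros::DeepNodeBase
{
public:
  using CallbackReturn =
    rclcpp_lifecycle::node_interfaces::LifecycleNodeInterface::CallbackReturn;

  explicit ImageInferenceNode(const rclcpp::NodeOptions & options)
  : deep_ros::DeepNodeBase("image_inference_node", options) {}

protected:
  // Set up ROS interfaces; plugin/model loading is handled by the base class.
  CallbackReturn on_configure_impl(const rclcpp_lifecycle::State &) override
  {
    sub_ = create_subscription<sensor_msgs::msg::Image>(
      "image_in", 10,
      [this](sensor_msgs::msg::Image::ConstSharedPtr msg) {
        // Convert msg to a Tensor and call run_inference(tensor) here;
        // the conversion helper is application-specific and not shown.
        (void)msg;
      });
    return CallbackReturn::SUCCESS;
  }

  CallbackReturn on_activate_impl(const rclcpp_lifecycle::State &) override
  {
    // The backend plugin and model are already loaded by DeepNodeBase at this point.
    return CallbackReturn::SUCCESS;
  }

  CallbackReturn on_deactivate_impl(const rclcpp_lifecycle::State &) override
  {
    return CallbackReturn::SUCCESS;
  }

  CallbackReturn on_cleanup_impl(const rclcpp_lifecycle::State &) override
  {
    sub_.reset();
    return CallbackReturn::SUCCESS;
  }

private:
  rclcpp::Subscription<sensor_msgs::msg::Image>::SharedPtr sub_;
};

}  // namespace my_package

RCLCPP_COMPONENTS_REGISTER_NODE(my_package::ImageInferenceNode)
```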

### Creating a Backend Plugin

1. **Implement three classes inheriting from:**

- `BackendMemoryAllocator` - Handle memory allocation/deallocation for your hardware
- `BackendInferenceExecutor` - Load models and run inference in your ML framework
- `DeepBackendPlugin` - Return instances of your allocator and executor

Key methods to implement:
- Allocator: `allocate()`, `deallocate()`, `allocator_type()`
- Executor: `load_model()`, `run_inference()`, `unload_model()`, `supported_model_formats()`
- Plugin: `backend_name()`, `get_allocator()`, `get_inference_executor()`

Don't forget: `PLUGINLIB_EXPORT_CLASS(YourPlugin, deep_ros::DeepBackendPlugin)`

2. **Create `plugins.xml`:**

```xml
<library path="my_backend_lib">
<class name="my_backend" type="MyBackendPlugin" base_class_type="deep_ros::DeepBackendPlugin">
<description>My custom backend</description>
</class>
</library>
```

## Testing
All tests for `deep_core` live in the `deep_test` package, which exposes test headers to downstream packages in a centralized way.
47 changes: 47 additions & 0 deletions deep_core/README.md
@@ -0,0 +1,47 @@
# deep_core

Core abstractions for ML inference in ROS 2 lifecycle nodes.

## Overview

Provides:
- `Tensor`: Smart pointer for tensor data with custom memory allocators
- `DeepNodeBase`: Lifecycle node base class with plugin loading and optional bond support
- Plugin interfaces for backend inference engines and memory management

## Key Components

### Tensor
Multi-dimensional tensor smart pointer supporting:
- Custom memory allocators (CPU/GPU/aligned memory)
- View semantics (wrap existing data without copying)
- Standard tensor operations (reshape, data access)

### DeepNodeBase
Lifecycle node that handles:
- Dynamic backend plugin loading via pluginlib
- Model loading/unloading lifecycle
- Optional bond connections for integration with nav2 and other lifecycle managers
- Parameter-driven configuration

### Plugin Interfaces
Deep_ROS abstracts hardware acceleration behind plugin interfaces, so users can switch between
different hardware accelerators at runtime. The backend plugin interface is as follows:
- `DeepBackendPlugin`: Abstract interface for defining a backend plugin. Must implement:
- `BackendMemoryAllocator`: Backend implementation for memory allocation and management
- `BackendInferenceExecutor`: Backend implementation for running model inference

## Configuration

All nodes inheriting from `deep_ros::DeepNodeBase` have the following settable parameters.

Required parameters:
- `Backend.plugin`: Plugin name (e.g., "onnxruntime_cpu")
- `model_path`: Path to the model file (dynamically reconfigurable at runtime;
  you can switch models while the node is running!)

Optional parameters:
- `Bond.enable`: Enable bond connections (default: false)
- `Bond.bond_timeout`: Bond timeout in seconds (default: 4.0)
- `Bond.bond_heartbeat_period`: Heartbeat period in seconds (default: 0.1)
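
As an illustration, these parameters can be supplied as overrides when composing a node in C++. `MyInferenceNode` is a hypothetical `DeepNodeBase` subclass; only the parameter names come from the list above.

```cpp
// Sketch: supplying the deep_core parameters via rclcpp::NodeOptions overrides.
// MyInferenceNode is a hypothetical DeepNodeBase subclass.
#include <memory>
#include <rclcpp/rclcpp.hpp>

int main(int argc, char ** argv)
{
  rclcpp::init(argc, argv);

  rclcpp::NodeOptions options;
  options.parameter_overrides({
    {"Backend.plugin", "onnxruntime_cpu"},     // required: backend plugin to load
    {"model_path", "/models/example.onnx"},    // required: reconfigurable while running
    {"Bond.enable", true},                     // optional: bond with a lifecycle manager
    {"Bond.bond_timeout", 4.0},                // optional: seconds
    {"Bond.bond_heartbeat_period", 0.1},       // optional: seconds
  });

  auto node = std::make_shared<MyInferenceNode>(options);
  rclcpp::spin(node->get_node_base_interface());
  rclcpp::shutdown();
  return 0;
}
```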