Robotics researcher bridging control, perception, and machine learning, with applications in healthcare imaging and autonomous systems for food security.
I am a robotics researcher with a background in mechanical systems, autonomous control, and perception. My work addresses decision-making and motion in real-world environments, from lesion analysis in medical imaging to mobile manipulation.
My approach combines control theory, optimization, and data-driven (machine learning) modeling, drawing on experience from academic research, autonomous system projects, Kaggle competitions, and laboratory development.
I have also designed UAV platforms for farmland monitoring, crop mapping, and weed detection, addressing challenges in food security through robotics and automation.
A pipeline for longitudinal analysis of dermoscopic lesions combining vision-language models with custom segmentation and temporal feature extraction. The approach borrows concepts from biomedical signal processing to quantify lesion evolution across timepoints. Multiple models were explored, including LLaVA-Med, GPT-based reasoning, MedGemma feature extraction, and the iToBoS change-detection framework.
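A minimal sketch of the temporal-feature step, assuming per-timepoint binary lesion masks have already been segmented (the function name, inputs, and pixel scale here are illustrative, not the project's actual interface):

```python
import numpy as np

def lesion_temporal_features(masks, times_days, mm_per_px=1.0):
    """Quantify lesion evolution across timepoints from binary masks.

    masks: list of 2D boolean arrays (one segmentation per visit)
    times_days: acquisition times in days
    mm_per_px: image scale (hypothetical default)
    """
    areas = np.array([m.sum() for m in masks], dtype=float) * mm_per_px**2
    dt = np.diff(times_days).astype(float)
    growth_rate = np.diff(areas) / dt          # mm^2 per day between visits
    rel_change = np.diff(areas) / areas[:-1]   # fractional change per interval
    return {"areas_mm2": areas,
            "growth_rate": growth_rate,
            "relative_change": rel_change}
```

Signal-processing-style features like these can then be fed alongside the vision-language model outputs to characterize change between visits.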
Kaggle International Competition — 1st place among 16 teams (0.6731 mAP, IoU 0.5–0.75). Built a two-stage pipeline for detecting multiple skin lesions from clinical images produced by 3D total-body photography. The system combined YOLOv8 for bounding-box localization with MedViT for lesion classification, balancing accuracy, speed, and robustness to imaging noise.
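Scoring at thresholds like IoU 0.5–0.75 comes down to box overlap between predictions and ground truth; a minimal intersection-over-union check (box format assumed to be `[x1, y1, x2, y2]`):

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes [x1, y1, x2, y2]."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)  # clamp to zero if disjoint
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)
```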
Built an integrated pipeline that unified exploration, planning, perception, and manipulation on the Turtlebot–SwiftPro platform. Exploration was driven by Next Best View selection in OctoMap, with RRT–Dubins motion planning ensuring feasible navigation for the non-holonomic base. Manipulation was handled through a recursive task-priority controller on the SwiftPro arm, allowing pick-and-place tasks to run in parallel with navigation.
Developed a planning framework for a Turtlebot platform under non-holonomic constraints. Implemented an RRT-like planner that grows the tree in continuous space while enforcing forward motion and bounded turning radius. Dubins path primitives ensured trajectory feasibility by connecting sampled nodes with smooth curves instead of straight lines. The planner was integrated with ROS, consuming occupancy grid maps and generating feasible paths to navigation goals.
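The feasibility constraint can be sketched as a steering step that grows the tree with forward-only motion and bounded curvature (the velocity, curvature limit, and step sizes below are illustrative constants, not the project's tuned values):

```python
import numpy as np

def steer(state, target_xy, v=0.2, kappa_max=2.0, dt=0.1, steps=20):
    """Propagate forward-only unicycle kinematics toward a sampled node,
    clipping curvature so the turning radius never drops below 1/kappa_max."""
    x, y, th = state
    path = [(x, y, th)]
    for _ in range(steps):
        # heading error to the sample, wrapped to [-pi, pi]
        desired = np.arctan2(target_xy[1] - y, target_xy[0] - x)
        err = (desired - th + np.pi) % (2 * np.pi) - np.pi
        kappa = np.clip(err / (v * dt), -kappa_max, kappa_max)
        th += v * kappa * dt          # bounded-rate heading change
        x += v * np.cos(th) * dt      # forward motion only (no reversing)
        y += v * np.sin(th) * dt
        path.append((x, y, th))
    return path
```

A full Dubins connection replaces this greedy steering with the optimal combination of arc and straight-line primitives, but the constraints being enforced are the same.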
Implemented a recursive task-priority controller for a Turtlebot base with a 4-DOF SwiftPro arm in ROS 1. The solver projected lower-priority commands into the null space of higher-priority ones, handling equality and inequality tasks. Behavior trees structured pick-and-place pipelines, integrating perception, navigation, and manipulation into a single flow. ArUco detection enabled online target acquisition, while joint limit enforcement and base suppression ensured safe actuation.
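The core of such a solver can be sketched for equality tasks in plain NumPy (the Jacobians here are toy matrices, not the Turtlebot–SwiftPro kinematic model):

```python
import numpy as np

def task_priority_dq(tasks):
    """Recursive task-priority resolution: each task's velocity is solved
    in the null space of all higher-priority tasks.

    tasks: list of (J, xdot) pairs, highest priority first.
    """
    n = tasks[0][0].shape[1]
    dq = np.zeros(n)
    P = np.eye(n)                      # null-space projector so far
    for J, xdot in tasks:
        Jbar = J @ P                   # task Jacobian restricted to the null space
        Jbar_pinv = np.linalg.pinv(Jbar)
        dq = dq + Jbar_pinv @ (xdot - J @ dq)
        P = P - Jbar_pinv @ Jbar       # shrink the remaining null space
    return dq
```

Inequality tasks (joint limits, base suppression) are layered on top by activating or deactivating them in the hierarchy as their thresholds are approached.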
Designed a localization pipeline on the Kobuki Turtlebot using a pose-based Extended Kalman Filter. The system fused encoder odometry, IMU yaw, and point clouds from a Realsense depth camera converted into 2D slices. Dead reckoning predicted motion through differential drive kinematics, while ICP alignment corrected accumulated drift by matching consecutive scans. IMU yaw was integrated as a pseudo-measurement to stabilize heading and guide ICP initialization. A Mahalanobis-based filter managed the state vector, keeping only statistically significant poses.
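Two of these pieces are easy to sketch in isolation: the dead-reckoning predict step and the Mahalanobis gate (the noise matrix and gate threshold below are illustrative, not the deployed tuning):

```python
import numpy as np

def predict(x, P, v, w, dt, Q):
    """EKF predict via differential-drive dead reckoning.
    State x = [px, py, yaw]; v, w are linear/angular velocities."""
    px, py, th = x
    x_new = np.array([px + v * np.cos(th) * dt,
                      py + v * np.sin(th) * dt,
                      th + w * dt])
    F = np.array([[1.0, 0.0, -v * np.sin(th) * dt],   # motion Jacobian
                  [0.0, 1.0,  v * np.cos(th) * dt],
                  [0.0, 0.0,  1.0]])
    return x_new, F @ P @ F.T + Q

def mahalanobis_gate(innov, S, thresh=7.81):
    """Accept a measurement only if its innovation is statistically
    plausible (7.81 = chi-square 95% bound for 3 degrees of freedom)."""
    d2 = innov @ np.linalg.solve(S, innov)
    return d2 < thresh
```

The same gating logic decides which poses stay in the state vector: updates whose innovation fails the test are treated as outliers rather than fused.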
Segmented side-scan sonar into sand, mud, maerl, and rock with weak labels. Trained with noisy masks, refined pseudo-labels using CRF, and improved class balance with Lovász-Softmax loss. The GIF cycles through qualitative samples and training curves.
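What makes Lovász-Softmax suit imbalanced seabed classes is that its gradient weights errors by their marginal effect on IoU; a NumPy sketch of the standard Lovász-gradient step (not the full training loss):

```python
import numpy as np

def lovasz_grad(gt_sorted):
    """Gradient of the Lovász extension of the Jaccard loss.

    gt_sorted: 0/1 ground-truth labels sorted by decreasing prediction
    error; returns the weight assigned to each error term.
    """
    gts = gt_sorted.sum()
    intersection = gts - np.cumsum(gt_sorted)
    union = gts + np.cumsum(1.0 - gt_sorted)
    jaccard = 1.0 - intersection / union
    jaccard[1:] = jaccard[1:] - jaccard[:-1]  # discrete derivative
    return jaccard
```

Because the weights track IoU directly, rare classes like maerl are not drowned out the way they are under plain pixel-wise cross-entropy.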
TETFund 2020 NRF Project – Mechatronics Research Group, UNN. A two-year national research project on UAV-based precision agriculture. My master's thesis work focused on the design and control of the drone, from CAD modeling of the airframe and embedded systems in SolidWorks to full kinematic, dynamic, and state-space implementation in Python. I proposed a hybrid controller combining model predictive control, feedback linearization, and backstepping for trajectory tracking. The platform was co-developed for farmland mapping, crop monitoring, and selective weed management, integrating onboard vision with a targeted spraying mechanism.
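As a flavor of the feedback-linearization ingredient, here is the altitude channel of a simplified point-mass quadrotor model, m·z̈ = u − m·g (the mass, gains, and model are illustrative, not the thesis platform or its hybrid controller):

```python
import numpy as np

def altitude_step(z, dz, z_ref, m=1.2, g=9.81, kp=4.0, kd=3.0, dt=0.01):
    """One step of feedback-linearized altitude control for m*z'' = u - m*g.
    Cancelling gravity in the thrust command leaves a linear PD error system."""
    u = m * (g + kp * (z_ref - z) + kd * (0.0 - dz))  # linearizing thrust law
    ddz = u / m - g                                    # closed-loop acceleration
    dz += ddz * dt
    z += dz * dt
    return z, dz
```

In the thesis, this kind of cancellation is combined with backstepping for the cascaded attitude loops and MPC for constrained trajectory tracking.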
Nigeria's first electric campus shuttle, built at UNN. I led the CAD design team, developing the vehicle body and chassis models that supported fabrication and assembly.
Archives of Control Sciences, 2024
Nigerian Journal of Technology, 2024
Journal of Engineering Research and Reports, 2024
Integration of Cloud Computing and IoT (book chapter), Taylor & Francis Group, 2024
CIGR Journal, 2023
Journal of Engineering Research and Reports, 2023
Thin Films-Deposition Methods and Applications (book chapter), IntechOpen, 2023
solomon.nwafor@unn.edu.ng
u1999124@campus.udg.edu
(+34) 600961183