ARMI

About Us

The ARMI Laboratory is a research and innovation center for autonomous robots and machine intelligence at the Mechatronics Engineering Program (MEC), Cluster 3, Vietnamese-German University (VGU), Vietnam.

ARMI Lab is dedicated to advancing autonomous robotics, digital twin technologies, and machine learning to solve complex, real-world industrial challenges. Our mission is to develop intelligent robotic systems that not only perceive and understand their environment, but also adapt, learn, and optimize their behavior autonomously in an industrial context.


Selected Research Publications

ROS based Digital Twin Framework for Collaboration Robot

Tri Bien Minh*, Phu Do, Hung Q. Nguyen, Khang H. V. Nguyen, Thao T.T. Phan — Proceedings of the International Conference on Sustainable Energy Technologies

This paper presents a real-time machine health monitoring digital twin (DT) framework for robot machine-tending based on ISO 23247. The system integrates OPC UA and ROS for seamless communication between physical and cyber layers.

Springer Link · Preprint · Code
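The physical-to-cyber data flow can be sketched as a minimal pure-Python mapping layer. This is only an illustration of the idea, not the paper's schema: the RobotState class and the OPC UA node-id strings below are made-up assumptions, and a real deployment would use a ROS node together with an OPC UA server library.

```python
from dataclasses import dataclass

@dataclass
class RobotState:               # stand-in for a ROS JointState-style message
    joint_positions: list
    spindle_load: float         # example machine-health signal

def to_opcua_nodes(state: RobotState, ns: int = 2) -> dict:
    """Flatten a robot state into {node_id: value} pairs that an
    OPC UA server could expose to the digital-twin layer."""
    nodes = {f"ns={ns};s=Robot.Joint{i}.Position": q
             for i, q in enumerate(state.joint_positions)}
    nodes[f"ns={ns};s=Machine.SpindleLoad"] = state.spindle_load
    return nodes

# Sample one state on the "ROS side" and publish it to the "cyber side".
state = RobotState(joint_positions=[0.0, 1.57, -0.78], spindle_load=42.5)
nodes = to_opcua_nodes(state)
```

In the ISO 23247 layering, this mapping sits between the data-collection entity and the digital-twin entity; the node-id namespace would come from the server's information model.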

Position-based Visual Servoing with Dual Manipulators

Tri Bien Minh*

The PBVS system uses an Intel RealSense D435 camera to track ARTag markers on a collaborative UR10e robot. Joint-velocity control enables real-time trajectory tracking from visual feedback.

Video · Preprint
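The core PBVS control law fits in a few lines of numpy. This is a generic sketch, not the paper's implementation: the gain and the toy identity Jacobian are illustrative, whereas the real system would use the UR10e's kinematic Jacobian and the camera-estimated pose error.

```python
import numpy as np

def pbvs_joint_velocities(pose_error, jacobian, gain=0.5):
    """Position-based visual servoing: drive the 6-DOF pose error e to
    zero exponentially with a Cartesian twist v = -gain * e, then map v
    to joint velocities via the pseudoinverse of the manipulator Jacobian."""
    twist = -gain * np.asarray(pose_error, dtype=float)
    return np.linalg.pinv(jacobian) @ twist

# Toy example: with an identity Jacobian, joint velocities equal the twist.
e = np.array([0.1, 0.0, -0.2, 0.0, 0.0, 0.05])  # [dx dy dz rx ry rz]
qdot = pbvs_joint_velocities(e, np.eye(6))
```

In practice the pose error comes from the ARTag detection in the camera frame, transformed into the robot base frame before the control law is applied.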

Optimisation Algorithm for Reactive Motion Control Manipulator

Tri Bien Minh*

A reactive optimization-based controller that avoids static and dynamic obstacles while driving the robot toward a target pose.

Video
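The reactive idea, attraction toward the goal combined with repulsion from nearby obstacles, can be illustrated with a classical potential-field controller. Note this is a deliberately simple stand-in: the paper's actual method is optimization-based, and all gains and radii below are arbitrary illustrative values.

```python
import math

def reactive_velocity(pos, goal, obstacles, k_att=1.0, k_rep=0.5, rho0=1.0):
    """Potential-field sketch of reactive motion: an attractive velocity
    toward the goal plus a repulsive term for each obstacle closer than
    the influence radius rho0."""
    vx = k_att * (goal[0] - pos[0])
    vy = k_att * (goal[1] - pos[1])
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        rho = math.hypot(dx, dy)
        if 1e-9 < rho < rho0:
            # Gradient of the classical repulsive potential (Khatib-style).
            gain = k_rep * (1.0 / rho - 1.0 / rho0) / rho ** 2
            vx += gain * dx / rho
            vy += gain * dy / rho
    return vx, vy
```

An optimization-based formulation instead solves for the velocity at each step subject to obstacle constraints, which avoids the local minima that plain potential fields can get stuck in.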

LiDAR-based Vehicle Detection using DBSCAN

Tri Bien Minh*, Hien Vo Bich — ICCRI 2023

A LiDAR-based embedded detection system using DBSCAN clustering on Jetson TX2 for real-time object detection in road environments.

Video · Preprint
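The clustering step can be shown with a minimal pure-Python DBSCAN over 2-D points. This is a toy sketch: the embedded pipeline on the Jetson TX2 would run an optimized implementation over projected LiDAR returns, and the eps/min_pts values below are arbitrary.

```python
def region_query(points, i, eps):
    """Indices of all points within eps of point i (including i itself)."""
    px, py = points[i]
    return [j for j, (x, y) in enumerate(points)
            if (x - px) ** 2 + (y - py) ** 2 <= eps ** 2]

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: -1 marks noise, clusters are numbered from 0."""
    NOISE, UNSEEN = -1, None
    labels = [UNSEEN] * len(points)
    cluster = 0
    for i in range(len(points)):
        if labels[i] is not UNSEEN:
            continue
        neighbors = region_query(points, i, eps)
        if len(neighbors) < min_pts:
            labels[i] = NOISE          # may later become a border point
            continue
        labels[i] = cluster
        seeds = list(neighbors)
        while seeds:                    # expand the cluster from core points
            j = seeds.pop()
            if labels[j] == NOISE:
                labels[j] = cluster     # border point: claim, don't expand
            if labels[j] is not UNSEEN:
                continue
            labels[j] = cluster
            jn = region_query(points, j, eps)
            if len(jn) >= min_pts:      # j is a core point: keep expanding
                seeds.extend(jn)
        cluster += 1
    return labels

# Two tight groups of "vehicle" returns plus one stray return.
pts = [(0, 0), (0.3, 0), (0, 0.3), (0.2, 0.2),
       (5, 5), (5.3, 5), (5, 5.3), (5.2, 5.2),
       (10, 0)]
labels = dbscan(pts, eps=0.5, min_pts=3)
```

The brute-force neighbor query here is O(n^2); a real-time system would use a KD-tree or voxel grid for the range search.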

Robot Gesture Control Using Multi-Tracking System

Khang Hoang Vinh Nguyen, Tri Bien Minh, Van Chi Le, Phu Xuan Do — AETA 2022

A wireless gesture-controlled robot system using Vicon motion capture for real-time control without onboard computing or wearable electronics.

Video · Preprint

MiniRos: an autonomous UGV robot

Tri Bien Minh*, Hua Thanh Luan, Do Xuan Phu — ICSSE 2021

A ROS-based autonomous mobile robot for education and research with 2D SLAM, navigation, and outdoor capabilities.

V-frame Octocopter: Design & Simulation

Tri Bien Minh*, Hien Vo, Hua Thanh Luan — IJ Intelligent Unmanned Systems, 2023

A novel V-frame octocopter with FEM structural validation and Simulink PID tuning using the Ziegler-Nichols method.
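The classic closed-loop Ziegler-Nichols rules reduce to a few lines: Kp = 0.6 Ku, Ti = Tu/2, Td = Tu/8, given the ultimate gain Ku and oscillation period Tu found at the stability limit. The Ku and Tu values below are made-up examples, not figures from the paper.

```python
def ziegler_nichols_pid(Ku, Tu):
    """Classic Ziegler-Nichols PID rules, returned as parallel-form
    gains (Kp, Ki, Kd) with Ki = Kp/Ti and Kd = Kp*Td."""
    Kp = 0.6 * Ku
    Ti, Td = Tu / 2.0, Tu / 8.0
    return Kp, Kp / Ti, Kp * Td

# Example: Ku = 8.0, Tu = 0.5 s  ->  Kp = 4.8, Ki = 19.2, Kd = 0.3
Kp, Ki, Kd = ziegler_nichols_pid(Ku=8.0, Tu=0.5)
```

These values are typically only a starting point; attitude loops on multirotors are usually refined further in simulation or flight tests.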

Adaptive Optimal Control for Upper Exoskeleton

Do Xuan Phu, Tri Bien Minh — ICMT 2021

A novel saturation-function-based optimal controller for upper-extremity exoskeletons, validated in simulation.

Robotics Project Portfolio

Upper body humanoid robot design (14 DOF)

A 14-DOF humanoid upper-body robot (6-DOF head + 2× 4-DOF arms) integrated with a depth camera. Designed for 3D printing or laser cutting, for research and education.

Autonomous Service Robot

A ROS-based service robot using a 2D LiDAR, an embedded computer, and multiple sensors. Supports indoor SLAM and autonomous navigation for restaurants, hotels, and hospitals.

Docking & Locking for Electric Bike

Award-winning charger docking and locking system for electric motorbikes — developed for the Bosch Green Challenge. Recognized for innovation in industrial design.

Ant-like Robot (23 DOF)

A six-legged, 23-DOF biomimetic robot with 3 DOF per leg. Uses servo motors and a lightweight frame — ideal for STEM education and locomotion research.

RC Humanoid Robot (19 DOF)

A remote-controlled 19-DOF humanoid built for mechanical engineering and robotics education. Designed to be easy to manufacture via 3D printing or laser cutting.

Technical Capabilities

Programming & Frameworks

Core development in low-level and high-level languages for robotics and AI.

Skills: Python, C++, ROS, PyTorch, Git, LaTeX

Lab Tutorials & Curriculum

  • Robotics and Autonomous Systems (ROS, PyTorch)
  • Embedded Intelligent System (ROS, OpenCV)
  • Microcontroller / Digital Signal Processing
  • Robotics Workshop (CAD and PCB Design)

Hands-on Hardware Experience

  • Robot Platforms: UR10e, KUKA youBot, TurtleBot 3, NAO, DJI drones.
  • Sensors: Velodyne VLP-16, Xsens MTi-30 IMU, Intel RealSense, SICK LiDAR.
  • Embedded Computers: NVIDIA Jetson family, Raspberry Pi, NUC, Arduino.
  • Actuators: Servo motors, linear actuators, motor drivers.

ARMI V-Research Model

The ARMI V-Research Model is a modern adaptation of the classical V-Model, redesigned for autonomous-robotics and machine-intelligence workflows. It aligns verification (left side) with validation (right side), ensuring that every research activity, from requirements and system design through perception algorithms and ML model pipelines to field testing, is checked against measurable criteria. The model guides ARMI Lab in developing safe, reliable, high-performance robotic systems using digital twins, simulation-driven development, ML lifecycle management, and continuous sim-to-real improvement.

ARMI V-Research Model Diagram

The model emphasizes four key principles:

  • Simulation-first development using digital twins (Unity, Isaac Sim, Gazebo) for safe testing, rapid iteration, and synthetic data generation.
  • Integrated ML lifecycle driving perception, planning, and decision-making modules through continuous training, monitoring, and model validation.
  • Hardware-in-the-loop (HIL) and system-in-the-loop validation to ensure robust sim-to-real transfer and safety-critical performance.
  • Full verification & validation alignment so that research requirements, algorithms, and system behaviors always map to measurable real-world tests, acceptance KPIs, and field trials.

This V-Research framework ensures that ARMI Lab produces scalable robotic technologies with strong scientific rigor, repeatability, and industrial-grade robustness — from early concept development to deployment in real autonomous robots.