- 1. Team
- 2. Project Overview
- 3. Key Features
- 4. Core Technologies
- 5. Technical Challenges and Solutions
- 6. System Design
- 7. Project Structure
- 8. Tech Stack
- 9. Project Management
- 10. License
📄 Looking for the Korean version? See README.ko.md.
| Name | GitHub | Role |
|---|---|---|
| Jinhyuk Jang | | ▪ Project planning & overall leadership ▪ ROS2 architecture, package layout, and FSM design ▪ Vision AI model & Vision Service implementation ▪ ROS2 × PyQt Robot GUI development |
- Project Goal
  - Let the robot autonomously take over repetitive hotel operations so that staff workload is reduced and guests receive a novel, convenient service.
- Project Timeline
  - July 7, 2025 – August 13, 2025 (38 days)
| Key Stages | Description | Media |
|---|---|---|
| Order Placement | ▪ Guests open the Guest GUI via the QR code placed in each room. ▪ After reviewing the menu, the order is submitted → the Staff GUI receives the notification. ▪ Once the dish is ready, the Staff GUI sends a “Pickup Request.” | |
| Pickup & Loading | ▪ Roomie drives to the restaurant pickup waypoint. ▪ ArUco marker detection aligns the robot precisely with the pickup spot. ▪ The Robot GUI shows the order list to prevent loading mistakes. ▪ Drawer control includes door open/lock sensors and a load-presence sensor. | |
| In-room Delivery | ▪ Nav2-based navigation drives the robot to the room entrance. ▪ The destination ArUco marker confirms the exact door location. ▪ Arrival notifications are sent through both the Guest GUI and the Robot GUI. | |
| Item Handover | ▪ Guests operate the Robot GUI to unlock the drawer and take the order. ▪ The robot returns to the standby area once the task is complete. | |
| Key Stages | Description | Media |
|---|---|---|
| Request & Destination Input | ▪ Destination is auto-filled when the guest authenticates with the room card. ▪ Manual input is also available from the Guest GUI or Robot GUI. | |
| Guest Identification | ▪ The rear camera detects the guest to be escorted. ▪ A DeepSORT-based target tracking algorithm follows the identified guest. | |
| Guided Escort | ▪ The robot keeps a safe distance while guiding the guest to the destination. ▪ If the guest leaves the field of view, the robot pauses. ▪ The mission resumes automatically once the guest is detected again. | |
| Key Stages | Description | Media |
|---|---|---|
| Elevator Call | ▪ The Vision Service extracts button coordinates. ▪ The Arm Controller presses the lobby call button. | |
| Boarding & Interior Interaction | ▪ The robot centers itself with the door before boarding. ▪ Arm motion is driven by the size and coordinates of the floor buttons. ▪ OCR on the overhead display confirms arrival at the target floor. | |
| Exit | ▪ After arriving at the destination floor, the robot centers itself and exits safely. | |
| Key Stages | Description | Media |
|---|---|---|
| Dashboard | ▪ Track the number of active jobs and robots in real time. ▪ Draw robot positions on the 2D map. | |
| Robot Management | ▪ Monitor each robot’s position, task assignment, and battery level. | |
| Job History | ▪ Review job lists and per-task logs. | |
- Hardware
  - Servo motors
  - 2D camera
  - Button-click end effector
- Button detection
  - Compute base → wrist → camera → button coordinates
- Motion sequence
  - Observation pose → pre-push pose → button press → confirmation
- Control method
  - Apply a Gaussian velocity/acceleration profile to minimize jitter (sketch below)
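
A minimal sketch of the Gaussian velocity-profile idea, assuming a hypothetical servo interface; the step count, sigma, and joint angles are illustrative, not the project's tuned values:

```python
import numpy as np

def gaussian_velocity_profile(start_angle: float, goal_angle: float,
                              steps: int = 50, sigma: float = 0.18) -> np.ndarray:
    """Return per-step joint angles whose velocity follows a Gaussian bell.

    The speed peaks mid-trajectory and tapers to ~0 at both ends, which is
    what removes the start/stop jitter of constant-velocity control.
    """
    t = np.linspace(-0.5, 0.5, steps)
    velocity = np.exp(-t**2 / (2 * sigma**2))   # bell-shaped speed profile
    position = np.cumsum(velocity)
    position = position / position[-1]          # normalize progress to [0, 1]
    return start_angle + (goal_angle - start_angle) * position

# Example: sweep a (hypothetical) button-press joint from 30° to 95°
for angle in gaussian_velocity_profile(30.0, 95.0):
    pass  # e.g. send `angle` to the servo at a fixed control rate
```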
- Nav2-based navigation
  - Generate and follow global/local paths
- Waypoint-driven path creation
  - Match depth-camera obstacles to pre-defined waypoints
  - Use the A* algorithm to compute the optimal route (sketch after this list)
- Dynamic obstacle handling
  - Detect obstacles via the depth camera in real time
  - Stop within a threshold distance and resume once the path clears
- RTR motion (Rotate–Translate–Rotate)
  - Provides precise alignment and backward motion inside elevators
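
A minimal sketch of A* search over a hand-built waypoint graph; the waypoint names, coordinates, and edges below are illustrative, not the project's actual map:

```python
import heapq
import math

# Hypothetical waypoints: name -> (x, y) in meters
WAYPOINTS = {"LOB_2": (0, 0), "HALL_1": (4, 0), "HALL_2": (4, 6), "ROOM_101": (8, 6)}
# Hypothetical traversable edges between waypoints
EDGES = {"LOB_2": ["HALL_1"], "HALL_1": ["LOB_2", "HALL_2"],
         "HALL_2": ["HALL_1", "ROOM_101"], "ROOM_101": ["HALL_2"]}

def dist(a: str, b: str) -> float:
    (x1, y1), (x2, y2) = WAYPOINTS[a], WAYPOINTS[b]
    return math.hypot(x2 - x1, y2 - y1)

def astar(start: str, goal: str) -> list[str]:
    """Classic A*: f = g (cost so far) + h (straight-line heuristic)."""
    open_set = [(dist(start, goal), 0.0, start, [start])]
    visited = set()
    while open_set:
        _, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for nxt in EDGES[node]:
            if nxt not in visited:
                ng = g + dist(node, nxt)
                heapq.heappush(open_set, (ng + dist(nxt, goal), ng, nxt, path + [nxt]))
    return []

print(astar("LOB_2", "ROOM_101"))  # ['LOB_2', 'HALL_1', 'HALL_2', 'ROOM_101']
```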
- YOLOv8n object detection
  - Obstacles: static, dynamic, glass doors
  - Elevators: buttons, displays, doors, direction indicators
- Accuracy boosters
  - CNN classifies detailed button types
  - EasyOCR reads the floor indicator (sketch after this list)
- Person tracking
  - YOLOv8n detects people
  - DeepSORT tracks a specific guest and publishes coordinates
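
A minimal sketch of the two-stage reading pipeline (YOLOv8n proposes the display ROI, EasyOCR reads it); the weights path, the `display` class label, and the input image are assumptions for illustration:

```python
import cv2
import easyocr
from ultralytics import YOLO

detector = YOLO("best.pt")        # hypothetical fine-tuned YOLOv8n weights
reader = easyocr.Reader(["en"])   # floor indicators are digits/letters

def read_floor_indicator(frame):
    """Detect the elevator display ROI with YOLO, then OCR its contents."""
    results = detector(frame, verbose=False)[0]
    for box in results.boxes:
        cls_name = results.names[int(box.cls)]
        if cls_name == "display":                 # assumed class label
            x1, y1, x2, y2 = map(int, box.xyxy[0])
            roi = frame[y1:y2, x1:x2]
            texts = reader.readtext(roi, detail=0)
            return texts[0] if texts else None
    return None

frame = cv2.imread("elevator_display.jpg")        # illustrative input frame
print(read_floor_indicator(frame))
```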
- Drawer-door detection
  - Measure the distance between the sensor and the door
  - If the distance exceeds 5.0 cm, the drawer is considered open
- Load detection
  - Side-mounted sensors measure the internal width
  - If the distance is below 25.0 cm, cargo is detected
- RFID card reader
  - MFRC522 reads the UID of each RFID card
  - Interprets the 4-byte value stored in block 4 as `location_id`
  - Publishes `success=true` plus the location value on success; `success=false`, `location_id=-1` otherwise (sketch below)
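
A minimal sketch of the threshold and RFID conventions above; the 5.0 cm / 25.0 cm thresholds and the `success`/`location_id` convention come from the list, while the function names and the big-endian byte interpretation are assumptions:

```python
DRAWER_OPEN_THRESHOLD_CM = 5.0    # door farther than this -> drawer open
LOAD_PRESENT_THRESHOLD_CM = 25.0  # inner width shorter than this -> cargo present

def drawer_is_open(door_distance_cm: float) -> bool:
    """Door sensor: a distance above 5.0 cm means the drawer is open."""
    return door_distance_cm > DRAWER_OPEN_THRESHOLD_CM

def load_is_present(inner_width_cm: float) -> bool:
    """Side sensors: a reading below 25.0 cm means something occupies the drawer."""
    return inner_width_cm < LOAD_PRESENT_THRESHOLD_CM

def rfid_result(block4_bytes: bytes | None) -> dict:
    """Interpret the 4-byte payload of block 4 as location_id, mirroring the
    success=true / success=false, location_id=-1 convention described above.
    (Big-endian byte order is an assumption.)"""
    if block4_bytes is None or len(block4_bytes) != 4:
        return {"success": False, "location_id": -1}
    return {"success": True, "location_id": int.from_bytes(block4_bytes, "big")}
```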
- RGB LED status
  - Control the LED color based on the `RobotState`
  - 💡 View the control logic

| State ID | State name | RGB LED |
|---|---|---|
| 0 | `INITIAL` | Cyan |
| 1, 2, 11, 13, 21, 23 | `CHARGING`, `WAITING`, `PICKUP_WAITING`, `DELIVERY_WAITING`, `GUIDE_WAITING`, `DESTINATION_SEARCHING` | Green |
| 10, 12, 20, 22, 30, 31 | `PICKUP_MOVING`, `DELIVERY_MOVING`, `CALL_MOVING`, `GUIDE_MOVING`, `RETURN_MOVING`, `ELEVATOR_RIDING` | Blue |
| 90 | `ERROR` | Red |
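
A minimal sketch of that state-to-color mapping; the IDs and colors come from the table, while the function name and the fallback for unknown states are assumptions:

```python
GREEN_STATES = {1, 2, 11, 13, 21, 23}   # charging / waiting / searching
BLUE_STATES = {10, 12, 20, 22, 30, 31}  # moving / riding the elevator

def led_color_for_state(state_id: int) -> str:
    """Map a RobotState ID to the RGB LED color listed in the table."""
    if state_id == 0:
        return "cyan"
    if state_id in GREEN_STATES:
        return "green"
    if state_id in BLUE_STATES:
        return "blue"
    if state_id == 90:
        return "red"
    return "off"  # assumption: unknown states turn the LED off
```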
- Problem: Constant-velocity control created small jitter at the end effector.
  - Solution: Applied Gaussian velocity/acceleration profiles to smooth out the motion.
- Problem: The robot could not re-route when corridors were narrow or blocked.
  - Solution: Added waypoint-based detours and ran the A* algorithm to compute bypass paths in advance.
- Problem: Adding more classes to YOLO degraded accuracy.
  - Solution: YOLOv8n generates the ROI, a CNN classifies the button type, and EasyOCR interprets the floor indicator.
- Problem: Plain Nav2 navigation struggled with button pressing and precise alignment.
  - Solution: Introduced RTR (Rotate–Translate–Rotate) patterns to enable fine alignment and backward motion (sketch below).
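
A minimal sketch of the Rotate–Translate–Rotate decomposition; the helper and frame conventions are illustrative, and a real implementation would hand these segments to the velocity controller:

```python
import math

def rtr_segments(x: float, y: float, yaw: float,
                 gx: float, gy: float, gyaw: float) -> list[tuple[str, float]]:
    """Decompose a pose-to-pose move into Rotate -> Translate -> Rotate.

    Returns (segment_type, amount) tuples: rotations in radians, translation
    in meters. A negative distance means driving backward, which is how the
    robot backs out of the elevator.
    """
    dx, dy = gx - x, gy - y
    distance = math.hypot(dx, dy)
    heading = math.atan2(dy, dx)

    def wrap(a: float) -> float:               # keep angles in (-pi, pi]
        return math.atan2(math.sin(a), math.cos(a))

    first_turn = wrap(heading - yaw)
    # If driving backward needs less turning, flip the heading and the sign.
    if abs(first_turn) > math.pi / 2:
        first_turn = wrap(first_turn + math.pi)
        distance = -distance
        heading = wrap(heading + math.pi)
    final_turn = wrap(gyaw - heading)
    return [("rotate", first_turn), ("translate", distance), ("rotate", final_turn)]

# Example: back ~0.8 m out of the elevator while keeping the current heading
print(rtr_segments(0.0, 0.0, 0.0, -0.8, 0.0, 0.0))
# -> rotate ~0, translate -0.8 (backward), rotate ~0
```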
User Requirements
[Priority Legend]
- `R` : Required implementation
- `O` : Optional implementation
| UR_ID | UR_NAME | UR Description | Condition | Priority |
|---|---|---|---|---|
| Guest | | | | |
| UR_01 | Call the robot | Request the robot to move to a specific location | Callable from: ▪ Lobby ▪ Guest room ▪ Restaurant | O |
| UR_02 | Guided escort | The robot guides the guest to a destination while carrying luggage | Supported locations: ▪ Guest room ▪ Lobby ▪ Restaurant | O |
| UR_03 | Personalized responses | Provide greetings in the guest’s preferred language | Triggered when: ▪ Guidance ends ▪ Delivery handover completes | O |
| UR_04 | Deliver amenities | Deliver requested items to the room | Items: ▪ Food & beverage: spaghetti, pizza ▪ Supplies: toothbrush, towel, bottled water, cutlery | R |
| UR_05 | Real-time progress tracking | Display the status of each requested job | Includes: ▪ Processing state ▪ Current position ▪ Estimated arrival time | R |
| UR_06 | Guest notifications | Notify the guest about job progress | Cases: ▪ Robot call: assigned, departed, arrived ▪ Guidance: started, finished ▪ Delivery: pickup arrival, pickup done, delivery arrival, received ▪ Failure alerts with reasons (blocked path, guest lost, collision, etc.) | R |
| Administrator | | | | |
| UR_07 | Job status management | Monitor every ongoing job | Includes: ▪ Current status ▪ Job ID ▪ Job type ▪ Failure indicator and reasons | O |
| UR_08 | Job history | Browse the entire job history | Filters: ▪ Job type ▪ Status ▪ Guest ID ▪ Room number | O |
| UR_09 | Job priority control | Reorder queued jobs | - | O |
| UR_10 | Robot information | Maintain robot-specific metadata | Fields: ▪ Robot ID ▪ Model name ▪ Manufacture date | O |
| UR_11 | Robot status | Track the current state of each robot | Fields: ▪ Location ▪ Battery level ▪ Charging state ▪ Assigned job ID ▪ System errors | O |
System Requirements
[Priority Legend]
- `R` : Required implementation
- `O` : Optional implementation
| SR_ID | SR_NAME | SR Description | Condition | Priority |
|---|---|---|---|---|
| SR_01 | Robot call | Call the robot to a specific location | Available at: - Room entrance (ROOM_XX) - Restaurant (RES_2) - Lobby (LOB_2) | R |
| SR_02 | Autonomous movement | Robots travel autonomously to execute or finish jobs | Job types: - Call - Guidance - Delivery - Food & beverage - Amenities | R |
| SR_02_01 | Path creation | Robot generates its own route to the target | - | R |
| SR_02_02 | Obstacle avoidance | Detect and avoid obstacles while driving | Obstacles: - Static: tables, chairs, trash bins - Dynamic: people | R |
| SR_02_03 | Collision detection | Pause when a collision is detected | Determine collisions via IMU thresholds | R |
| SR_02_04 | Tip-over detection | Detect rollovers and alert an admin | Determine tip-over via IMU thresholds | O |
| SR_02_05 | Following confirmation | Make sure the guest is following during guidance | - | R |
| SR_03 | Inter-floor travel | Use the elevator by calling it and pushing buttons | - | R |
| SR_03_01 | Elevator call | Summon the elevator to the current floor | Methods: - API call - Physical manipulation with the arm | R |
| SR_03_02 | Floor selection | Select the target floor after boarding | Methods: - API call - Physical manipulation with the arm | R |
| SR_03_03 | Elevator boarding | Board when the elevator arrives | Factors: - Direction - Position - Door state | R |
| SR_03_04 | Elevator exit | Exit when reaching the destination floor | Factors: - Position - Door state | R |
| SR_04 | In-job notifications | Notify guests while the job is running | Provide status updates per call/guidance/delivery and include failure reasons | R |
| SR_05 | Personalized responses | Play multilingual voice prompts at the start/end of tasks | Cases: - Guidance start - Call arrival - Guidance end - Delivery handover | R |
| SR_06 | Guidance request | Start guidance after reading the guest card key | Available at: - Room entrance - Restaurant - Lobby | R |
| SR_06_01 | Guest appearance recognition | Detect the guest’s appearance for tracking | Use the camera | O |
| SR_06_02 | Destination input | Provide multiple destination input methods | In-room: auto-filled from card / manual / voice / touchscreen; Elsewhere: restaurant / lobby | O |
| SR_07 | Delivery request | Request item delivery from the room | Delivery types: - Food (spaghetti, pizza) - Amenities (toothbrush, towel, bottled water, cutlery) | O |
| SR_08 | Load items | Staff load items at the pickup station | Capacity: up to two rooms | O |
| SR_08_01 | Load confirmation | Verify the items before departure | Flow: IR sensor pre-check → staff “Load Confirm” → departure countdown | R |
| SR_09 | Delivery tracking | Provide real-time delivery status to guests | Includes: - Progress stage - Current position - ETA | O |
| SR_10 | Job data management | Manage job types and statuses | Track call/guidance/delivery lifecycle | R |
| SR_10_01 | Job history lookup | Allow admins to query all jobs | Fields: - ID - Type - Status | R |
| SR_10_02 | Job monitoring | Provide job information to staff in real time | - | R |
| SR_10_03 | Job reorder | Manually change the queue order | - | R |
| SR_10_04 | Auto dispatch | Auto-assign queued jobs to idle robots | - | O |
| SR_11 | Auto return | Return to the lobby after jobs | Conditions: - Return on completion/cancellation - Move to the charger when needed | O |
| SR_12 | Robot info management | Manage robot ID, model, manufacture date | - | R |
| SR_12_01 | Robot lookup | Let admins filter/search the robot list | - | R |
| SR_13 | Robot state management | Manage per-robot state | Position, battery, charging state, job ID, error | R |
| SR_13_01 | Robot state monitoring | Provide real-time state data to admins | - | R |
| SR_13_02 | Collision alerts | Notify admins when collisions occur | Linked to SR_02_03 | R |
| SR_13_03 | State history | Review charging and collision logs | Charging ID/time, collision location/time | R |
| SR_14 | Auto charging | Auto-dock based on battery level | Docking-station charging | R |
| SR_14_01 | Low-battery return | Return to standby when battery <20% | - | R |
System Scenario
System Architecture
An overview of robots, GUIs, servers, and their communication flows
Elevators are notorious for unstable or non-existent network connectivity.
To stay reliable, Roomie embeds the Vision Service on the robot (on-device AI) so that buttons, doors, and floor indicators can be recognized offline.
Interface Specification
Roomie/
├── ros2_ws/ # Shared ROS2 workspace
│ ├── build/ # Created by colcon build
│ ├── install/
│ ├── log/
│ └── src/
│ ├── micro_ros_setup/ # micro-ROS build tools
│ ├── roomie_msgs/ # Shared messages (msg/srv/action)
│ ├── roomie_rc/ # Robot Controller node (RC)
│ ├── roomie_rgui/ # Robot GUI node (RGUI)
│ ├── roomie_vs/ # Vision Service node (VS)
│ ├── roomie_rms/ # Main server node (RMS)
│ ├── roomie_agui/ # Admin GUI node (AGUI)
│ ├── roomie_ac/ # Arm Controller node (AC)
│ └── bringup/ # Integrated launch files
│
├── esp32_firmware/ # ESP32 firmware for micro-ROS
│ ├── arm_unit/ # Servo control firmware for the arm
│ │ └── src/
│ └── io_controller/ # Sensor, drawer, LED control
│ └── src/
│
├── gui/ # Non-ROS GUI apps
│ ├── staff_gui/ # Staff GUI
│ └── guest_gui/ # Guest GUI
│
├── assets/ # Images and resources
│ └── images/
│
├── docs/ # Design documents
│ ├── architecture/ # System architecture
│ ├── interface.md # Communication interface definitions
│ └── state_diagram/ # State diagram
│
├── .gitignore
├── README.md
└── LICENSE
| Category | Technologies |
|---|---|
| ML / DL | |
| GUI | |
| Network & Protocol | |
| Robotics | |
| Environment | |
- Jira
  - Managed via Jira with six sprints.
  - Organized the backlog with Epic → Task hierarchies.
- Confluence
  - Documented the workflow in Confluence across planning, design, research, implementation, and testing.
  - Logged progress at regular intervals.
This project is licensed under the Apache License 2.0.
Refer to the LICENSE file for details.