Research @ ARC
Our Projects
Final Year Projects
FYP FAQ
Q: I am interested in Robotics. How do I do an FYP related to Robotics?
A: We are currently not accepting new FYP students, as the semester has already started. In the meantime, you can join our projects under a different module/programme such as UROP or IWP by contacting Prof Marcelo at mpeangh(a)nus.edu.sg. Alternatively, you may wait for the next round of the FYP selection process.
Q: Is there a deadline for applications?
A: Yes. The deadline depends on your home department or the programme you are under, and it is typically communicated to you via email. We also have an internal deadline of 15th May 2020. Additionally, note that the topics here are allocated on a first-come-first-served basis, so please contact us as soon as possible if you are interested in a topic.
Q: Are the topics listed for individual FYP? Would I be working alone for these projects?
A: It depends on the topic, but most topics here are part of a larger project. This means that you will be working alongside staff researchers and other students on the project. However, FYPs are graded at the individual level.
Q: Are the FYPs limited to specialisation students or FoE students only?
A: No. We welcome all students.
Q: I am really interested in a specific field in robotics which is not listed here. Can I self-propose my own topic?
A: Please contact Prof Marcelo at mpeangh-spamblocktag-(a)nus.edu.sg.
Q: My question is not listed here. Is there anyone I can ask?
A: For further inquiries, email robotics-spamblocktag-(a)nus.edu.sg.
Collaborative Robots and Manipulator Arms
Mobile Manipulator
The combined manipulator arm and base platform can be operated autonomously and remotely to assist humans in collaborative environments. The manipulator arm is a KUKA LBR iiwa 14 R820 with seven degrees of freedom (DOF), fitted with a Robotiq 3-finger gripper. The base platform is omnidirectional with zero-radius turning, configured to be highly maneuverable in order to cater to different workspace environments. The base can also be fitted with front and bottom cameras for obstacle/object detection. The setup is being used to explore human-robot interaction in workshop environments, under a research grant from A*STAR.
Autonomous Mobile Robots
Indoor Self-Driving Vehicles
Current research outcomes for self-driving vehicles (such as cars, buses and trucks) are largely implemented on open public roads. These high-speed autonomous navigation use cases are generally still governed by highway codes and rule-based driving regulations.
Indoor, high-density pedestrian settings pose a different set of challenges for autonomous navigation: people move along corridors in organic, spontaneous paths, and at times cross paths or move against the flow of traffic. In addition, indoor corridors can be narrow and cluttered with obstacles, so an autonomous wheelchair that is to complete such trips effectively and efficiently will require enhanced sensor-fusion and perception capabilities to support its path planning and localization.
The targeted research outcome is socially graceful navigation in pedestrian (non-road) environments using multiple sensing modalities, sensor fusion, and AI (e.g., deep learning). New sensor-fusion, localization and navigation techniques will be demonstrated.
Furthermore, the public increasingly expects robots to navigate in a socially acceptable manner; social-science study of human-robot interaction will therefore become even more important for practical deployment. Finally, navigating narrow corridors and crowded, dynamic environments may demand sensing beyond current and even next-generation technologies.
Autonomous Vehicles
Singapore Autonomous Bus (SGAB) Dataset
The SGAB dataset focuses on the perception tasks of detection, tracking and prediction, and was developed to support them. It comprises images and lidar point clouds of the urban environment, captured by sensors mounted on a bus traversing Singapore roads. The choice of a bus as the ego vehicle presents new challenges, such as a larger blind-spot area and sparser lidar point clouds, amongst others.
Dataset development
During the data-collection drive, sensor data was recorded using the Robot Operating System (ROS) and saved as rosbag files. Subsequently, the following data were extracted from the rosbags at 20 Hz:
+ radar points,
+ camera images and
+ lidar point clouds.
The data was then organised into 37 sets, each containing a continuous-time sequence of 200 s. The data was further downsampled to 2 Hz for manual ground-truth labelling of the car, truck, pedestrian, bicycle, motorbike, PMD, and bus classes. Each object was labelled with a consistent and unique identity within a set, so that object trajectories can be extracted.
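Downsampling from 20 Hz to 2 Hz amounts to keeping every 10th frame of each sequence. A minimal sketch of this step (the frame representation here is illustrative, not the dataset's actual format):

```python
# Downsample a 20 Hz frame sequence to 2 Hz by keeping every 10th frame.
SOURCE_HZ = 20
TARGET_HZ = 2
STRIDE = SOURCE_HZ // TARGET_HZ  # keep 1 frame out of every 10

def downsample(frames):
    """Return the subset of frames used for 2 Hz ground-truth labelling."""
    return frames[::STRIDE]

# A 200 s set at 20 Hz has 4000 frames; at 2 Hz, 400 remain for labelling.
frames = list(range(200 * SOURCE_HZ))
labelled = downsample(frames)
print(len(frames), len(labelled))  # 4000 400
```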
Data formatting
Each set is contained in a folder with the following types of files:
+ folder of camera images at 20 Hz
+ folder of .pcd point cloud files at 20 Hz
+ JSON file with radar data at 20 Hz
+ JSON file with labelled objects in top-down view in the Universal Transverse Mercator (UTM) 48N frame at 2 Hz
+ JSON file with labelled objects in top-down view in the ego frame at 2 Hz
+ CSV file of occluded objects, which are present in the frame but not visible (these objects can be tracked but not detected)
Each type of file is sorted into its respective folder, with files from the same set sharing the same name.
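Because files belonging to the same frame share a common name across the type folders, a loader can pair them by file stem. A minimal sketch under assumed folder and file names (`images/`, `lidar/`, `radar.json` are illustrative, not the official layout):

```python
import json
import tempfile
from pathlib import Path

# Build a toy set folder mimicking the described layout.
# Folder and file names here are assumptions, not the official ones.
root = Path(tempfile.mkdtemp()) / "set_00"
for sub in ("images", "lidar"):
    (root / sub).mkdir(parents=True)
(root / "images" / "000001.jpg").write_bytes(b"")
(root / "lidar" / "000001.pcd").write_bytes(b"")
(root / "radar.json").write_text(json.dumps({"000001": [{"x": 1.2, "y": 3.4}]}))

def frame_paths(set_dir, stem):
    """Collect all per-frame files, paired by their shared file stem."""
    return {folder.name: folder / f"{stem}{ext}"
            for folder, ext in [(set_dir / "images", ".jpg"),
                                (set_dir / "lidar", ".pcd")]}

radar = json.loads((root / "radar.json").read_text())
paths = frame_paths(root, "000001")
print(sorted(paths), "000001" in radar)  # ['images', 'lidar'] True
```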
Acknowledgement
When using this dataset in your research, please cite us with the following BibTeX entry:
@Article{machines10030202,
AUTHOR = {Chong, Yue Linn and Lee, Christina Dao Wen and Chen, Liushifeng and Shen, Chongjiang and Chan, Ken Kok Hoe and Ang, Marcelo H.},
TITLE = {Online Obstacle Trajectory Prediction for Autonomous Buses},
JOURNAL = {Machines},
VOLUME = {10},
YEAR = {2022},
NUMBER = {3},
ARTICLE-NUMBER = {202},
URL = {https://www.mdpi.com/2075-1702/10/3/202},
ISSN = {2075-1702},
DOI = {10.3390/machines10030202}
}
This research is supported by the National Research Foundation, Prime Minister's Office, Singapore under its LTA Urban Mobility Grand Challenge Program, Project Code UM01/002, ST Kinetics Autonomous Bus Trial.
Download dataset
Please fill in your email to get the download link.
Autonomous Vehicles
Self-Driving Buses
This project aims to develop the perception stack for a self-driving bus. The 12-meter bus should be able to perform typical feeder services autonomously. The perception module we are developing aims to detect and track all dynamic actors in the environment and predict each actor's future location. Using the predicted future positions of obstacles, the bus can then plan safer and more comfortable paths.
This project is done in collaboration with ST Engineering Land Systems.
Others
Validation of the Scalability of Blockchain
Phase 3 is now EXTENDED! Reinstall the app for more rewards!
System Maintenance on 6 December 2023, 9 am – 3 pm
Purpose of this program
This mobile phone application, NUS Walker, was created as part of a joint research project between the NUS Advanced Robotics Centre (NUS-ARC) and Denso Corporation to validate the scalability of blockchain for mobility services in the NUS community.
Once installed, NUS Walker sends data from the phone to a server. We will then study the traffic flow and density of the data, and how these affect the network efficiency and scalability of blockchain-enabled services.
Data collected includes daily steps taken, GPS data and a UserID. The UserID is created automatically by NUS Walker when you install the application. NUS Walker will not capture any personal particulars/information of the person who downloaded and installed the app.
Data Collected
The data will be collected over the following dates and times.
Date: 16 October 2023, 9:00 – 20 December 2023, 23:59
Time: 24 hours (data collection deactivates when the app is closed)
Note!
– Data will be kept on our servers until the end of the research project (31 January 2024).
– In the event of unforeseen system trouble, data collection on NUS Walker may be stopped temporarily without prior notice.
– Do remember to keep NUS Walker running in the background. If NUS Walker is not running in the background during the data collection timings, the step counter will not count your steps and you will not be able to earn reward points.
– The end of data collection may be extended depending on the participation rate.
– Please refer to this page for the latest updates: NUSWalker
Reward Points and Voucher Redemption
Each participant will be given some welcome points to start this program. Additional reward points will be earned by users, based on the number of physical steps taken per day, in line with the encouragement for a healthy lifestyle through walking. The reward points will translate to vouchers that can be redeemed.
Reward points structure:
– Welcome Bonus Points: 200 points
– Every 200 steps: 1 point
– Maximum of 50 points/ day
– 100 referral points will be rewarded for every referred user (You can refer up to 5 people)
– During the boosting period (30 October 2023, 9:00 to 20 December 2023, 23:59, Singapore Time), the points earned and the daily maximum will be DOUBLED!
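The points rules above (1 point per 200 steps, capped at 50 points per day, with both doubled during the boost period) can be sketched as follows; the function name and structure are ours, not part of the app:

```python
# Daily reward points under the published rules:
# 1 point per 200 steps, capped at 50 points/day; both doubled during boost.
STEPS_PER_POINT = 200
DAILY_CAP = 50

def daily_points(steps, boosted=False):
    """Compute the reward points earned from one day's step count."""
    multiplier = 2 if boosted else 1
    points = (steps // STEPS_PER_POINT) * multiplier
    return min(points, DAILY_CAP * multiplier)

print(daily_points(4_000))         # 20
print(daily_points(30_000))        # 50 (capped)
print(daily_points(30_000, True))  # 100 (doubled cap)
```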
Rewards Redemption
Rewards can be redeemed at the NUS Advanced Robotics Centre (NUS-ARC).
Vouchers available for redemption and the respective reward points needed:
Voucher | 5 SGD | 10 SGD
Starbucks Gift Card | N.A. | 800 points
GrabRides Digital Gift Card | 500 points | 800 points
GrabFoods Digital Gift Card | 500 points | 800 points
Fairprice Voucher | 500 points | 800 points
Vouchers can be redeemed at NUS-ARC. Please bring your phone with NUS Walker installed to scan a QR code to redeem the voucher.
Advanced Robotics Centre
National University of Singapore
Faculty of Engineering
5 Engineering Drive 1
Block E6, Level 7, E6-07-01
Singapore 117608
Collection Date and Timings
Date: 16 October 2023 – 21 December 2023
Time: 14:00 – 18:00 (weekdays)
Note!
– Type of vouchers may change depending on availability.
– If the end date of data collection is extended, the end date of rewards redemption will also be extended accordingly.
Please refer to NUS-ARC website for the latest updates: https://arc.nus.edu.sg/.
Participation criteria
Any NUS student/staff, or anyone referred by an NUS student/staff, who has an Android mobile phone (OS: Android version 10 and above) and is willing to download this application is eligible to participate in this research.
Withdrawal from program
You have the right to withdraw from the program at any time by deleting the app, after which we will stop collecting data. Any data already collected will remain part of the program; if you wish to have it discarded, please contact the research team.
Be sure to redeem your rewards before you withdraw. You will not be able to do this after you withdraw.
Contact us
For any clarifications on research-related matters, you may contact the research team at robotics@nus.edu.sg
Soft Robotics
Soft End-Effectors
A sensorized robotic manipulator system is developed for
(i) agriculture such as fruit and vegetable picking and sorting,
(ii) food packaging, such as cakes, fruits and pies,
(iii) advanced manufacturing, such as handling glassware and electronic components, and
(iv) service robots.
It provides safe, compliant and efficient grasping, improving performance on tasks previously deemed too delicate for traditional robotic end-effectors.
Soft Robotics
Soft Actuators
The figure on the left shows some examples of soft actuators. The top one is made from silicone elastomer, the middle one from nylon fabric, and the bottom one is 3D printed. These actuators are pneumatically driven: when certain pneumatic pressures are applied, the prototypes exhibit different actuation profiles. The top actuator deforms into a helix, while the other two bend in a plane.
Marine Robotics
Unmanned Surface Vehicle (USV)
The USV (BBASV 2.0) can operate independently and cooperatively to perform tasks remotely and autonomously on surface water. The motors are placed in a vectored configuration for holonomic drive, giving the vehicle four degrees of freedom (DOF). It also has a Launch and Recovery System (LARS) that allows autonomous deployment of the AUV from the USV, so the two can work in tandem. The vehicle is built to withstand up to sea state 3, and its systems can be easily configured to cater to the specific needs of different stakeholders. The NUS Bumblebee USV team participated in the 2018 Maritime RobotX Challenge in Hawaii and emerged as Champions.
Marine Robotics
Autonomous Underwater Vehicles (AUV)
The AUV can be operated autonomously and remotely to perform tasks in the water with six degrees of freedom (DOF), at depths of up to 100 meters. It has front-facing and bottom-facing cameras, along with a front-facing sonar for long-distance object detection. It navigates using its onboard Inertial Measurement Unit (IMU), Doppler Velocity Logger (DVL), and acoustics system. The vehicle is highly modular and can be configured to suit different underwater challenges. The NUS Bumblebee AUV team achieved second place at the 2018 International RoboSub Competition held in San Diego, USA.
ARC Research
The Advanced Robotics Centre (ARC) is an interdisciplinary research centre, established jointly by NUS Engineering and NUS Computing, with a focus on human-robot collaborative systems.
The vision of ARC is to become a prominent centre of robotics research and a sought-after resource by the industry and the society in Singapore and the region.
The mission of ARC is to advance the state of the art in robotics research, and to develop novel robotic platforms and application areas with high impact on productivity and innovation in the industrial ecosystem, and on improving the quality of our lives.
Please click on the links in the sidebar to view our research projects.