Thesis

Investigation of novel sensing and human-robot collaboration for enhanced quality inspections

Creator
Rights statement
Awarding institution
  • University of Strathclyde
Date of award
  • 2026
Thesis identifier
  • T17639
Person Identifier (Local)
  • 202190985
Qualification Level
Qualification Name
Department, School or Faculty
Abstract
  • The integration of collaborative robots (cobots) and automated inspection stages in industrial manufacturing remains challenging due to the complexity of quality control processes, particularly in liquid spreading, painting, and coating applications. Existing quality control methods suffer from significant inconsistencies and operational inefficiencies: manual inspection typically exhibits error rates of 30–50 percent, with throughput losses that can reduce effective productivity by up to 40 percent. Fully automated systems, while faster, often lack the adaptability and flexibility required to assess both surface defects and thickness variations in real time. This thesis investigates the development of Human-Robot Collaboration (HRC), sensing, and vision systems to enhance real-time quality control inspection of liquid spreading and coating applications in manufacturing environments. Specifically, it presents a ResNet-101-based CNN vision system and a capacitive sensing approach, with a UR10e cobot employed to assist in sensor deployment and spatial scanning, tested on a case study focused on liquid spreading and coating uniformity assessment. The thesis makes several key contributions. It introduces a novel sensor fusion approach that integrates capacitive sensing for thickness consistency inspection, validated through a main case study performed in collaboration with Unilever and a second case study inspecting coatings that mimic real industrial inspection scenarios, such as those found in the automotive and aerospace sectors. As part of this contribution, an adaptive regression model for thickness estimation is proposed, achieving a Mean Squared Error (MSE) of 0.00699 and an R-squared (R²) value of 0.915. Both metrics demonstrate the robustness and predictive capability of the model for non-destructive thickness measurement beyond existing techniques.
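The reported MSE and R² values can be illustrated with a minimal sketch. The snippet below fits an ordinary least-squares line to synthetic capacitance-to-thickness data and computes both metrics with NumPy; the data, the linear form, and all coefficients are illustrative assumptions, not the adaptive regression model or measurements from the thesis.

```python
import numpy as np

# Synthetic capacitance readings and corresponding thickness values.
# The true relationship (slope 0.4, intercept 0.1) is an assumption
# made purely for this example.
rng = np.random.default_rng(0)
capacitance = np.linspace(1.0, 5.0, 50)
thickness = 0.4 * capacitance + 0.1 + rng.normal(0.0, 0.01, 50)

# Fit thickness = a * capacitance + b by ordinary least squares.
X = np.column_stack([capacitance, np.ones_like(capacitance)])
coef, *_ = np.linalg.lstsq(X, thickness, rcond=None)
pred = X @ coef

# Mean Squared Error and coefficient of determination (R²).
mse = np.mean((thickness - pred) ** 2)
ss_res = np.sum((thickness - pred) ** 2)
ss_tot = np.sum((thickness - thickness.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
```

On such low-noise synthetic data the fit is nearly perfect; the thesis's reported MSE of 0.00699 and R² of 0.915 reflect real sensor noise and a more sophisticated adaptive model.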
Additionally, a customized ResNet-101 model is developed and optimized for liquid spreading defect classification using a dataset of 6,900 images. The model achieved a training accuracy of 97.99% and a validation accuracy of 100%. While such performance is unusually high for real-world image classification tasks, this outcome is attributed to the controlled nature of the dataset, the clear visual distinctions between the Full, Fault, and Empty classes, and the relatively limited size and variability of the validation set. These factors reduce dataset complexity and increase the likelihood of near-perfect accuracy; nonetheless, further testing on larger and more diverse datasets is required to fully validate the model's robustness and rule out dataset bias or inadvertent information leakage. The sensing and vision systems are used to inspect liquid spreading on a mimic substrate: the capacitive sensor checks thickness consistency, while the vision system classifies the spreading from surface images into the three categories Full, Empty, and Fault. For the sensing component specifically, the approach was extended to test coating over a metal plate, mimicking coating processes in the automotive industry and other industrial coating applications. The results demonstrated that the approach can be adopted for coating classification, successfully categorizing coatings into four distinct classes: no coating, full coating, excessive coating, and defective coating. To ensure seamless integration with industrial workflows, the system is deployed on a collaborative robot (UR10e), leveraging automated motion planning and real-time data synchronization for defect detection and thickness assessment.
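The classification accuracies quoted above are conventionally computed from a per-class confusion matrix. The sketch below shows that computation with NumPy for the three spreading classes; the label arrays are made-up examples for illustration, not predictions from the thesis's ResNet-101 model or its 6,900-image dataset.

```python
import numpy as np

# Class indices for the three spreading categories (an assumed ordering).
CLASSES = ["full", "fault", "empty"]

def confusion_matrix(y_true, y_pred, n_classes=3):
    """Count how often each true class (row) was predicted as each class (column)."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    return cm

def accuracy(cm):
    """Fraction of samples on the diagonal, i.e. correctly classified."""
    return np.trace(cm) / cm.sum()

# Illustrative ground-truth labels and predictions, not thesis data.
y_true = np.array([0, 0, 1, 1, 2, 2])
y_pred = np.array([0, 0, 1, 2, 2, 2])
cm = confusion_matrix(y_true, y_pred)
```

Here one "fault" sample is misclassified as "empty", giving an accuracy of 5/6; validation accuracy of 100%, as reported in the thesis, corresponds to a purely diagonal confusion matrix.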
Furthermore, as the research is HRC-based, a human operator with significant responsibilities collaborates in the inspection process, enabling the adoption of full in-process quality control inspection cells in industrial settings. The human operator is responsible for monitoring the inspection process and taking action based on the obtained results. Finally, these contributions are applied to the inspection of liquid spreading and coating processes in industrial settings, demonstrating an HRC-based quality control inspection system that autonomously detects defects, assesses thickness variations, and provides real-time feedback to the operator for process optimization. The system can be easily adapted to various manufacturing applications, providing scalable and flexible solutions for industries such as automotive, aerospace, and healthcare. These advancements collectively address the limitations of traditional quality control methods, paving the way for broader adoption of AI-driven, HRC-based inspection systems in modern manufacturing.
Advisor / supervisor
  • Yang, Erfu
  • Luo, Xichun
Resource Type
DOI
