Loop Closure Detection Based on Improved NetVLAD Image Feature Extraction

QIU Changbin, WANG Qingzhi, LIU Qipeng

Institute of Complexity Science, Qingdao University, Qingdao 266071, China
Abstract  Traditional loop closure detection algorithms mostly rely on handcrafted features to represent images. Such features are not robust to environmental changes such as variations in illumination and viewpoint, and their extraction is time-consuming, so they cannot meet real-time requirements. To address these problems, we propose an improved algorithm that uses a deep neural network to extract more robust image features. Specifically, the atrous spatial pyramid pooling (ASPP) module is introduced into the classic NetVLAD model to characterize the image. Through the fusion of multi-scale features, the resulting feature maps have lower dimensionality and higher resolution, yielding more accurate and compact image descriptors. Experiments on public datasets show that the proposed algorithm achieves higher precision and recall, copes with changes in illumination and viewpoint to a certain extent, and takes less time to extract image features.
Received: 29 August 2022

Published: 28 December 2023
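As a rough illustration of the pipeline the abstract describes, the following PyTorch sketch stacks an ASPP block on a backbone feature map and aggregates the fused multi-scale features with a NetVLAD pooling layer into a single compact descriptor for loop closure matching. All module names, channel sizes, dilation rates, and the cluster count are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ASPP(nn.Module):
    """Atrous spatial pyramid pooling: parallel dilated convolutions fused by a 1x1 conv."""
    def __init__(self, in_ch, out_ch, rates=(1, 6, 12, 18)):  # dilation rates are illustrative
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Conv2d(in_ch, out_ch,
                      kernel_size=3 if r > 1 else 1,
                      padding=r if r > 1 else 0,
                      dilation=r)
            for r in rates
        ])
        self.fuse = nn.Conv2d(out_ch * len(rates), out_ch, kernel_size=1)

    def forward(self, x):
        # Concatenate multi-scale responses and project back to out_ch channels.
        return self.fuse(torch.cat([F.relu(b(x)) for b in self.branches], dim=1))


class NetVLAD(nn.Module):
    """NetVLAD pooling: soft-assign local descriptors to K clusters and aggregate residuals."""
    def __init__(self, dim, num_clusters=64):
        super().__init__()
        self.num_clusters = num_clusters
        self.assign = nn.Conv2d(dim, num_clusters, kernel_size=1)   # soft-assignment weights
        self.centroids = nn.Parameter(torch.randn(num_clusters, dim))

    def forward(self, x):
        n, d, h, w = x.shape
        soft_assign = self.assign(x).view(n, self.num_clusters, -1).softmax(dim=1)  # (N, K, HW)
        x_flat = x.view(n, d, -1)                                                   # (N, D, HW)
        # Residuals between local descriptors and cluster centres, weighted by the soft assignment.
        residual = x_flat.unsqueeze(1) - self.centroids.view(1, self.num_clusters, d, 1)
        vlad = (residual * soft_assign.unsqueeze(2)).sum(dim=-1)                    # (N, K, D)
        vlad = F.normalize(vlad, p=2, dim=2).view(n, -1)                            # intra-normalisation
        return F.normalize(vlad, p=2, dim=1)                                        # global descriptor


# Hypothetical assembly: CNN backbone feature map -> ASPP fusion -> NetVLAD descriptor.
backbone_channels, reduced_channels = 512, 256
aspp = ASPP(backbone_channels, reduced_channels)
pool = NetVLAD(reduced_channels, num_clusters=64)

feature_map = torch.randn(1, backbone_channels, 30, 40)   # stand-in for a backbone's output
descriptor = pool(aspp(feature_map))                       # compact image descriptor
print(descriptor.shape)                                     # torch.Size([1, 16384]) = (batch, clusters * channels)
```

In a loop closure detection setting, descriptors like the one above would be compared (e.g. by cosine similarity) against those of previously visited places to decide whether the current view closes a loop.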