
Paper deep dive

Automated identification of Ichneumonoidea wasps via YOLO-based deep learning: Integrating HiresCam for Explainable AI

Joao Manoel Herrera Pinheiro, Gabriela Do Nascimento Herrera, Alvaro Doria Dos Santos, Luciana Bueno Dos Reis Fernandes, Ricardo V. Godoy, Eduardo A. B. Almeida, Helena Carolina Onody, Marcelo Andrade Da Costa Vieira, Angelica Maria Penteado-Dias, Marcelo Becker

Year: 2026 · Venue: arXiv preprint · Area: cs.CV · Type: Preprint · Embeddings: 53

Abstract

Accurate taxonomic identification of parasitoid wasps within the superfamily Ichneumonoidea is essential for biodiversity assessment, ecological monitoring, and biological control programs. However, morphological similarity, small body size, and fine-grained interspecific variation make manual identification labor-intensive and expertise-dependent. This study proposes a deep learning-based framework for the automated identification of Ichneumonoidea wasps using a YOLO-based architecture integrated with High-Resolution Class Activation Mapping (HiResCAM) to enhance interpretability. The proposed system simultaneously identifies wasp families from high-resolution images. The dataset comprises 3,556 high-resolution images of Hymenoptera specimens. The taxonomic distribution is primarily concentrated among the families Ichneumonidae (n = 786), Braconidae (n = 648), Apidae (n = 466), and Vespidae (n = 460). Extensive experiments were conducted using a curated dataset, with model performance evaluated through precision, recall, F1-score, and accuracy. The results demonstrate high accuracy of over 96% and robust generalization across morphological variations. HiResCAM visualizations confirm that the model focuses on taxonomically relevant anatomical regions, such as wing venation, antennae segmentation, and metasomal structures, thereby validating the biological plausibility of the learned features. The integration of explainable AI techniques improves transparency and trustworthiness, making the system suitable for entomological research to accelerate biodiversity characterization in an under-described parasitoid superfamily.

Tags

ai-safety (imported, 100%) · cscv (suggested, 92%) · preprint (suggested, 88%)

Links


Intelligence

Status: not_run | Model: - | Prompt: - | Confidence: 0%

Entities (0)

No extracted entities yet.

Relation Signals (0)

No relation signals yet.

Cypher Suggestions (0)

No Cypher suggestions yet.

Full Text

52,756 characters extracted from source content.


Automated identification of Ichneumonoidea wasps via YOLO-based deep learning: Integrating HiresCam for Explainable AI

João Manoel Herrera Pinheiro (a), Gabriela do Nascimento Herrera (b), Alvaro Doria dos Santos (c), Luciana Bueno dos Reis Fernandes (b), Ricardo V. Godoy (a), Eduardo A. B. Almeida (d), Helena Carolina Onody (e), Marcelo Andrade da Costa Vieira (a), Angélica Maria Penteado-Dias (b) and Marcelo Becker (a,∗)

(a) São Carlos School of Engineering, University of São Paulo, São Carlos, 13566590, São Paulo, Brazil
(b) Department of Ecology and Evolutionary Biology, Federal University of São Carlos, São Carlos, 13565905, São Paulo, Brazil
(c) Federal University of Tocantins, Porto Nacional, 77500000, Brazil
(d) Department of Biology, University of São Paulo, Ribeirão Preto, 14040901, Brazil
(e) State University of Piauí, Deputado Jesualdo Cavalcanti Campus, Corrente, 64980000, Brazil

Keywords: Arthropod, Biodiversity, Convolutional neural network, Computer vision, Entomology, Hymenoptera, Taxonomic identification, XAI

Abstract

Accurate taxonomic identification of parasitoid wasps within the superfamily Ichneumonoidea is essential for biodiversity assessment, ecological monitoring, and biological control programs. However, morphological similarity, small body size, and fine-grained interspecific variation make manual identification labor-intensive and expertise-dependent. This study proposes a deep learning-based framework for the automated identification of Ichneumonoidea wasps using a YOLO-based architecture integrated with High-Resolution Class Activation Mapping (HiResCAM) to enhance interpretability. The proposed system simultaneously identifies wasp families from high-resolution images. The dataset comprises 3,556 high-resolution images of Hymenoptera specimens. The taxonomic distribution is primarily concentrated among the families Ichneumonidae (n = 786), Braconidae (n = 648), Apidae (n = 466), and Vespidae (n = 460).
Extensive experiments were conducted using a curated dataset, with model performance evaluated through precision, recall, F1-score, and accuracy. The results demonstrate high accuracy of over 96% and robust generalization across morphological variations. HiResCAM visualizations confirm that the model focuses on taxonomically relevant anatomical regions, such as wing venation, antennae segmentation, and metasomal structures, thereby validating the biological plausibility of the learned features. The integration of explainable AI techniques improves transparency and trustworthiness, making the system suitable for entomological research to accelerate biodiversity characterization in an under-described parasitoid superfamily.

⋆ Source code: https://github.com/joaomh/identification-of-Ichneumonoidea-waps-YOLO-2026 Dataset: https://zenodo.org/records/18501018
∗ Corresponding author: joao.manoel.pinheiro@usp.br (J.M.H. Pinheiro); becker@sc.usp.br (M. Becker)
ORCID(s): 0009-0001-6192-7374 (J.M.H. Pinheiro); 0009-0000-0371-3012 (G.d.N. Herrera); 0000-0002-7997-4195 (A.D.d. Santos); 0009-0008-9329-4509 (L.B.d.R. Fernandes); 0000-0002-5323-9299 (R.V. Godoy); 0000-0001-6017-6364 (E.A.B. Almeida); 0000-0003-3570-8183 (H.C. Onody); 0000-0002-6038-7740 (M.A.d.C. Vieira); 0000-0002-8371-5591 (A.M. Penteado-Dias); 0000-0002-7508-5817 (M. Becker)

J.M.H. Pinheiro et al.: Preprint submitted to Elsevier. Copyright may be transferred without notice.
arXiv:2603.16351v1 [cs.CV] 17 Mar 2026

1. Introduction

Imagine inhabiting a world in which more than 80% of species remain entirely unknown to science. This is the current state of our knowledge regarding Class Insecta (Mora et al., 2011; Stork, 2018). Although insects represent the most species-rich group of animals and account for over half of all described species (May, 1986; Resh and Cardé, 2009), our inventory of this diversity is still far from complete. Approximately one million species have been formally described, and scientists estimate that an additional 5.5 million species remain undiscovered and undescribed (Stork, 2018; Eggleton, 2020). We are currently facing a significant gap in insect taxonomy (Slade and Ong, 2023; Ong et al., 2025), a problem exacerbated by the ongoing global decline in insect species (Wagner et al., 2021; Fenoglio et al., 2021).

This decline has direct impacts on human well-being (Schowalter et al., 2018), as insects are a cornerstone of global biodiversity (Cardoso et al., 2020) and perform crucial ecosystem functions. These functions include pollination (Gabriel and Tscharntke, 2007), maintaining the health of agricultural ecosystems (Jankielsohn, 2018), natural pest control (Pardo and Borges, 2020), and decomposition (Eggleton, 2020). Consequently, the accurate identification of insect species is vital for effective biodiversity monitoring and ecological research. Furthermore, precise classification is essential to distinguish agricultural pests from beneficial organisms. Contrary to common perception, the vast majority of insects are not harmful to humans (Allison et al., 2023).

The order Hymenoptera comprises ants, bees, and wasps and represents one of the most species-rich insect orders (Forbes et al., 2018). Members of this order play essential ecological roles, particularly as pollinators (Barbizan Sühs et al., 2009; Beggs et al., 2011). Among Hymenoptera, the Ichneumonoidea superfamily is one of the most diverse in the Neotropics (Quicke, 2015; Yu et al., 2016, 2012). These wasps primarily parasitize larvae and pupae of holometabolous insects, although some groups can parasitize adult arthropods and arachnid oothecae, contributing to the maintenance of ecological balance (Quicke, 2015). The Ichneumonoidea superfamily comprises two major families, Ichneumonidae and Braconidae.
The Ichneumonidae, commonly known as Darwin wasps (Klopfstein et al., 2019), is a hyper-diverse family of parasitoid wasps, with over 25,000 described species across 37 subfamilies and 1,450 genera (Yu et al., 2012; Quicke, 2015). Of these, 4,419 species have been described in the Neotropical region, and 955 have been recorded in Brazil. They are parasitoids of larvae and pupae of holometabolous insects, such as Coleoptera, Lepidoptera, and Hymenoptera, as well as other arthropods (Gauld and Bolton, 1988; Hanson and Gauld, 1995). Ichneumonidae have been comparatively less utilized in biological control programs, although their parasitoid behavior can effectively regulate the abundance of other insects, including agricultural pests (Quicke, 2015). Taxonomically, the group poses significant difficulties, as recognition of subfamilies is complex, particularly compared to that of the Braconidae. Identification is often restricted to females, as males frequently lack distinctive diagnostic features (Butcher and Quicke, 2023).

The Braconidae constitutes the second most diverse family within the Hymenoptera. This family includes over 21,000 described species across more than 1,100 genera, though these numbers represent only a fraction of their true global diversity (Yu et al., 2012; Quicke, 2015; Chen and van Achterberg, 2019). Due to their prevalence as parasitoids of other insects (Matthews, 1974), braconids play a pivotal role in terrestrial ecosystems and are extensively utilized as agents in biological control programs (Shaw and Huddleston, 1991). Taxonomically, the most reliable distinction is found in the wing venation: braconids almost invariably lack the second recurrent vein (2m-cu) in the fore wing, a vein that is typically present in ichneumonids (Quicke, 2015).
Traditionally, insect identification has been the domain of expert entomologists, relying heavily on morphological examination under microscopes, detailed dichotomous keys, and extensive reference collections (Wipfler et al., 2016). This classical approach, while foundational to our understanding of insect diversity, is inherently labor-intensive, time-consuming, and demands highly specialized training and years of experience (Magni et al., 2023).

Deep learning, a rapidly evolving field within artificial intelligence, utilizes computational models composed of multiple processing layers to learn abstract data representations (Goodfellow et al., 2016; Bishop and Bishop, 2023). This distinguishes deep learning from traditional statistical prediction approaches (Sarker, 2021). These methods have significantly advanced various domains, including image classification, semantic segmentation, object detection, and speech recognition (Shinde and Shah, 2018; Sharifani and Amini, 2023). The core principle involves discovering intricate structures in large datasets through the backpropagation algorithm, which dictates how a machine adjusts its internal parameters to compute representations across layers (LeCun et al., 2015; Zhao et al., 2024). Unlike traditional machine learning that relies on carefully engineered feature extractors, deep learning automatically discovers the necessary representations from raw data (O'Mahony et al., 2020; Indolia et al., 2018) and has shown promising results across several application domains (Alzubaidi et al., 2021; Bhatt et al., 2021).

While deep learning has received significant attention in other domains, its application in invertebrate monitoring and biodiversity research has been slow to develop (Christin et al., 2019). However, this has changed over the past decade, as deep learning has begun to revolutionize the fields of entomology and ecology (Weinstein, 2018; Li et al., 2021; Høye et al., 2021).
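The backpropagation principle described above (iteratively adjusting internal parameters along the error gradient) can be illustrated at toy scale. The sketch below is not from the paper; it fits a single weight w in y = w·x by gradient descent on the mean squared error, with illustrative data generated from w_true = 3:

```python
# Toy gradient descent: fit y = w * x to data generated with w_true = 3.
xs = [1.0, 2.0, 3.0]
ys = [3.0, 6.0, 9.0]  # targets from w_true = 3

w, lr = 0.0, 0.05
for _ in range(200):
    # gradient of the mean squared error wrt w: d/dw (w*x - y)^2 = 2*(w*x - y)*x
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad  # step against the gradient

print(round(w, 3))  # → 3.0
```

Deep networks apply the same update rule to millions of parameters at once, with the chain rule propagating the gradient backwards through the layers.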
Deep learning and computer vision offer potential solutions to the long-standing challenges of inefficient and labor-intensive insect identification (Ärje et al., 2020; De Cesaro Júnior and Rieder, 2020; Teixeira et al., 2023; Gao et al., 2024), monitoring (Bilik et al., 2024), and pest detection (Wu et al., 2019; Barbedo, 2020; Batz et al., 2023; Passias et al., 2024). However, a critical gap remains in the literature regarding the taxonomic complexity of the Ichneumonoidea superfamily.

In the context of evaluating automated identification systems for hyper-diverse taxa, it is notable that some studies, such as the DiversityScanner (Wührl et al., 2022, 2024), have detected and identified 14 families for robot handling with a precision of 91.4%. However, in their evaluation of the Ichneumonoidea superfamily, only 246 images were used to assess the identification model's performance. Furthermore, while the system employed class activation maps to visualize the features the neural network prioritized during identification, these heatmaps were primarily used for internal model validation rather than being systematically compared against established morphological keys. In the domain of parasitoid wasps, Shirali et al. (2024) demonstrated the efficacy of deep learning for identifying the highly diverse and cryptic Diapriidae family using a dataset of 2,257 images. Their study compared three architectures, with the BEiTv2 transformer model achieving the highest accuracy of 96% for genus-level identification and 97% for sex determination, significantly outperforming YOLOv8 and ConvNeXt.

In this study, we present a novel deep learning framework specifically designed for the automated identification of the hyper-diverse Ichneumonoidea superfamily, leveraging transfer learning and benchmarking state-of-the-art architectures, including YOLOv12 and YOLOv26.
A critical component of this approach is the integration of Explainable Artificial Intelligence (XAI) techniques, specifically HiResCAM, which provides high-resolution visual interpretations of the model's internal decision-making process. These visualizations enable the identification of morphologically relevant regions, such as wing venation and metasomal structures, that align with traditional taxonomic criteria, thereby enhancing the transparency and biological plausibility of the predictions. The dataset and source code are publicly available to ensure reproducibility and to support further research in biodiversity informatics.

Figure 1: Specimens were retrieved from the DCBU collection (Penteado-Dias and Fernandes, 2025). Image from (Pinheiro et al., 2026).

2. Materials and methods

2.1. Data collection

The biological material for this study, focusing on the Ichneumonoidea, was sourced from the DCBU taxonomic collection at UFSCar. Following specimen retrieval, morphological documentation was conducted using a Leica M205C stereomicroscope paired with a K5C digital camera. The acquisition process was managed via LAS X software, while the final high-depth-of-field composites were generated through digital image stacking in Helicon Focus. Figure 1 illustrates the photographic workflow from specimen collection to final image processing.

The Dataset of Parasitoid Wasps and Associated Hymenoptera (DAPWH) (Herrera Pinheiro et al., 2026) comprises high-resolution images of Hymenoptera specimens, with a primary focus on the families Ichneumonidae and Braconidae. The dataset contains a total of 3,556 images, of which more than 40% correspond to Ichneumonoidea wasps, as detailed in Table 1.
Figure 2 shows some samples of these wasps.

Table 1: Distribution of images per family in DAPWH.

Family          Images
Ichneumonidae   786
Braconidae      648
Apidae          466
Vespidae        460
Megachilidae    298
Chrysididae     244
Andrenidae      244
Pompilidae      190
Bethylidae      94
Halictidae      75
Colletidae      51
Total           3,556

2.2. Model architecture

For the automated identification of Ichneumonoidea, we selected the YOLOv12 (Tian et al., 2025) and YOLOv26 (Sapkota et al., 2026) architectures. These models represent the current state-of-the-art in object detection and classification, offering peak performance for complex biological datasets. Their selection was essential for processing the intricate morphological data found in the DAPWH dataset, as experimental tests revealed that the YOLO framework provided the fastest training times among the evaluated architectures (Pinheiro and Becker, 2026). We implemented the nano variants of both architectures, specifically yolov12n-cls and yolo26n-cls.

2.3. Data splitting and image rescaling

For the model development phase, the dataset was partitioned into three distinct subsets to ensure robust training and unbiased evaluation. Following established methodological conventions (Raschka, 2020), we allocated 70% of the total images to the training set, while the remaining 30% was divided equally, with 15% dedicated to validation during training and 15% reserved as an independent held-out test set. This distribution ensures that the final performance metrics represent the model's ability to generalize to unseen Ichneumonoidea specimens. The final partitioning of the dataset into training, validation, and test subsets is detailed in Table 2.

Table 2: Dataset distribution by family after splitting.

Family          Train   Val   Test   Total
Andrenidae      170     36    38     244
Apidae          326     69    71     466
Bethylidae      65      14    15     94
Braconidae      453     97    98     648
Chrysididae     170     36    38     244
Colletidae      35      7     9      51
Halictidae      52      11    12     75
Ichneumonidae   550     117   119    786
Megachilidae    208     44    46     298
Pompilidae      133     28    29     190
Vespidae        322     69    69     460
Total           2,484   528   544    3,556

Given the high-fidelity nature of the original stacked images acquired with the Leica M205C system, spatial downsampling was required to align with the neural network's computational constraints. All images were rescaled to a fixed input dimension of 512 × 512 pixels for YOLO training.

2.4. Training and evaluation

The training and evaluation of the models were performed on a high-performance workstation running Linux. The hardware configuration consisted of an AMD Ryzen 9 7900 CPU, 64 GB of DDR5 RAM, and an NVIDIA RTX 4090 GPU with 24 GB of VRAM, using CUDA 13.1.

Figure 2: Examples of samples in the DAPWH dataset (Herrera Pinheiro et al., 2026). (a)-(f) Braconidae; (g)-(l) Ichneumonidae.

To quantify the classification performance of the developed models, we employed a suite of standard evaluation metrics: Accuracy, Precision, Recall, and the F1-score, defined by Eqs. (1), (2), (3), and (4), respectively. These indicators are widely recognized as benchmarks for both image classification and object detection tasks (Lin et al., 2015).
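All four metrics reduce to ratios of the confusion counts. A minimal Python sketch (function name hypothetical, not from the paper's released code):

```python
def classification_metrics(tp: int, tn: int, fp: int, fn: int) -> dict:
    """Accuracy, precision, recall, and F1 from raw confusion counts."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}

m = classification_metrics(tp=8, tn=5, fp=2, fn=1)
print(m["accuracy"], m["precision"])  # → 0.8125 0.8
```

In the multi-class setting used here, the counts are accumulated per family from the confusion matrix and the per-family scores are then averaged.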
Accuracy = (TP + TN) / (TP + TN + FP + FN)    (1)

Precision = TP / (TP + FP)    (2)

Recall = TP / (TP + FN)    (3)

F1 = 2 · Precision · Recall / (Precision + Recall)    (4)

where TP, TN, FP, and FN represent true positives, true negatives, false positives, and false negatives, respectively.

2.5. Model interpretability

While quantitative metrics such as accuracy, precision, recall, and F1-score are statistical measures of performance, they do not reveal the neural network's decision-making process. To ensure the taxonomic validity of the model's predictions, we employed Explainable AI (XAI) techniques, specifically High-Resolution Class Activation Mapping (HiResCAM) (Draelos and Carin, 2021). HiResCAM computes element-wise importance scores to produce visualization maps that are strictly faithful to the model's computations. This higher spatial precision allows us to verify whether the model is focusing on relevant morphological diagnostic traits, such as specific wing venation patterns, rather than learning spurious correlations from the background.

2.6. Research workflow

Figure 3 shows the overview of the proposed explainable identification framework. High-resolution images of Ichneumonidae specimens are provided as input to the YOLOv26 model. The network extracts hierarchical feature representations that capture discriminative patterns across convolutional layers. Finally, HiResCAM is applied to generate class-discriminative activation maps that highlight biologically relevant regions, such as wing venation and thoracic structures, to support transparent and interpretable predictions.

3. Results and discussion

3.1. Model performance

The performance evaluation conducted on the DAPWH test set indicates that both architectures achieve high levels of taxonomic discrimination for the Ichneumonoidea superfamily. As summarized in Table 3, the YOLOv26 model demonstrated superior performance across all evaluated metrics.
Specifically, YOLOv26 achieved a Top-1 Accuracy of 96.14%, representing a significant improvement over the 94.85% attained by the YOLOv12 variant. Regarding the model's reliability in identifying complex morphological features, YOLOv26 reached a Precision of 93.43% and a robust Recall of 97.04%. The resulting F1-score of 95.20% further confirms the model's effectiveness in balancing false positives and negatives.

Table 3: Performance comparison of YOLO classification models on the DAPWH test set.

Model     Accuracy   Precision   Recall   F1
YOLOv12   0.9485     0.9132      0.9429   0.9278
YOLOv26   0.9614     0.9343      0.9704   0.9520

Both models exhibited stable convergence over the 150 training epochs, with a rapid reduction in training loss during the initial iterations followed by gradual stabilization. For YOLOv12 (Fig. 4), the training loss decreased sharply within the first 20-30 epochs and asymptotically approached near-zero values, while the validation loss stabilized around 0.20 after early fluctuations. The Top-1 accuracy increased consistently, surpassing 0.95 in later epochs, whereas Top-5 accuracy rapidly saturated, remaining close to 1.00 throughout most of the training process. These trends indicate efficient feature learning and strong generalization capacity without evident signs of overfitting.

Similarly, YOLOv26 (Fig. 5) demonstrated fast convergence and improved stability during validation. The validation loss exhibited slightly lower variance than YOLOv12 and converged to marginally lower values. Top-1 accuracy steadily improved to approximately 0.97 in the final epochs, indicating robust ranking performance. Overall, both architectures achieved high classification accuracy; however, YOLOv26 presented smoother validation behavior and slightly superior generalization performance.

The normalized confusion matrices (Fig. 6) demonstrate strong class-level discrimination for both models, with dominant diagonal values indicating high per-family accuracy.
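The row-normalization behind such confusion matrices is straightforward: each row of the raw matrix is divided by that row's total, so the diagonal reads as per-family recall. A minimal NumPy sketch (matrix values illustrative, not the paper's results):

```python
import numpy as np

def normalize_rows(cm: np.ndarray) -> np.ndarray:
    """Divide each row (true class) by its total so every row sums to 1;
    the diagonal then gives per-class recall."""
    totals = cm.sum(axis=1, keepdims=True).astype(float)
    out = np.zeros(cm.shape, dtype=float)
    np.divide(cm, totals, out=out, where=totals > 0)  # guard empty rows
    return out

# Illustrative 2-class matrix (rows: true class, columns: predicted class)
cm = np.array([[95, 5],
               [3, 97]])
print(normalize_rows(cm))  # diagonal: 0.95 and 0.97
```

The `where` guard leaves rows for absent classes at zero instead of dividing by zero, which matters when a rare family has no test images.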
For YOLOv12, most families achieved correct identification rates above 93%, including Bethylidae, Vespidae, Megachilidae, Apidae, Chrysididae, and Ichneumonidae. Moderate confusion was observed for Colletidae and Halictidae, suggesting greater morphological similarity or class-imbalance effects. Limited cross-family misclassification occurred primarily between taxonomically related groups, such as Andrenidae and Halictidae, and between Megachilidae and Colletidae.

YOLOv26 showed improved overall discrimination across Apidae, Bethylidae, Halictidae, and Vespidae. Ichneumonidae achieved 97% accuracy, while Braconidae and Chrysididae remained above 94%. Although Colletidae remained less separable, cross-class confusion was generally lower than with YOLOv12. The concentration of high diagonal values and the reduction of off-diagonal errors indicate enhanced generalization and more consistent inter-family boundary learning in YOLOv26.

Figure 3: Research workflow.

Figure 4: Training and validation performance of the YOLOv12 model over 150 epochs.

Figure 5: Training and validation performance of the YOLOv26 model over 150 epochs.

Figure 6: Normalized confusion matrices. (a) YOLOv12; (b) YOLOv26.

3.2. Model interpretability

The qualitative analysis of the learned representations and attention maps provides further insight into the model's internal decision-making process. The feature activation maps extracted from intermediate convolutional layers reveal that the network progressively encodes discriminative morphological patterns, emphasizing structural contours while suppressing background information. The diversity of activation responses across channels indicates hierarchical feature abstraction, ranging from low-level edge detection to higher-level morphological descriptors.

In Fig. 7, the visualizations demonstrate that the model emphasizes critical structural contours, such as the wing segmentation for the family Ichneumonidae. For the family Braconidae, the visualization of intermediate convolutional layers demonstrates that the model also effectively suppresses background noise. As shown in Fig. 8, the hierarchical encoding process prioritizes diagnostically relevant anatomical regions, such as the metasomal segmentation. By capturing these multi-scale features, the convolutional layers enable the model to achieve high per-family accuracies of 97% for Ichneumonidae and 94% for Braconidae, as shown in the normalized confusion matrix (Fig. 6).

Figure 7: Representative feature map samples extracted from intermediate convolutional layers of YOLOv26, illustrating the hierarchical encoding of morphological structures and texture patterns for Ichneumonidae. (a) Habitus, lateral; (b) Head, frontal.

Figure 8: Representative feature map samples extracted from intermediate convolutional layers of YOLOv26, illustrating the hierarchical encoding of morphological structures and texture patterns for Braconidae. (a) Habitus, lateral; (b) Head, frontal.

3.3. Ichneumonidae

For the identification of Ichneumonidae, two situations were observed. In the first, the model likely relied on traditional morphological characters used to distinguish the family, particularly those of the fore wing (Fig. 9 and Fig. 10). For example, the presence of the fore wing vein 2m-cu is a crucial character for identifying Ichneumonidae. This corresponds to step 2 in the key for the separation of British and Irish Braconidae and Ichneumonidae (Broad et al., 2018). Another important wing character is the absence of vein RS+M forming the discosubmarginal cell, which is used in step 3 of the same key. Additionally, facial features were also captured, such as the convex face typical of Ichneumonidae (Fig. 11).

In the second situation, however, the model appeared to rely on non-traditional diagnostic characteristics (Fig. 12). Instead of focusing on explicit structural features commonly used in taxonomy, the model based its decisions on broader morphological patterns or overall visual similarity.

Figure 9: HiResCAM visualizations for Ichneumonidae. The heatmaps demonstrate that the YOLOv26 architecture prioritizes wing venation patterns, notably the discosubmarginal cell, aligning with established entomological keys.

Figure 10: HiResCAM visualizations for Ichneumonidae. The heatmaps demonstrate that the YOLOv26 architecture prioritizes wing venation patterns, particularly the second recurrent vein (2m-cu), aligning with established entomological keys.

Figure 11: HiResCAM visualizations for Ichneumonidae. The heatmaps demonstrate that the YOLOv26 architecture prioritizes the convex face, aligning with established entomological keys.
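For reference, the HiResCAM computation behind these maps (Section 2.5) keeps the element-wise product of gradients and activations before summing over channels, unlike Grad-CAM, which first averages the gradients into per-channel weights. A minimal NumPy sketch with hypothetical array shapes, not the paper's implementation:

```python
import numpy as np

def hirescam_map(activations: np.ndarray, gradients: np.ndarray) -> np.ndarray:
    """HiResCAM attribution (Draelos and Carin, 2021): element-wise
    gradient * activation, summed over channels, then ReLU.

    activations: feature maps of shape (C, H, W) from a conv layer.
    gradients:   gradient of the class score w.r.t. those maps, same shape.
    """
    assert activations.shape == gradients.shape
    cam = (gradients * activations).sum(axis=0)  # (H, W)
    cam = np.maximum(cam, 0.0)                   # keep positive evidence only
    if cam.max() > 0:                            # scale to [0, 1] for display
        cam = cam / cam.max()
    return cam

# Tiny worked example: two 2x2 channels, all-ones activations
acts = np.ones((2, 2, 2))
grads = np.array([[[1.0, -1.0], [0.0, 2.0]],
                  [[1.0,  1.0], [0.0, -3.0]]])
print(hirescam_map(acts, grads))  # normalized map: [[1, 0], [0, 0]]
```

In practice the gradients come from a backward pass through the trained network, and the resulting map is upsampled to the input resolution and overlaid on the specimen image.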
This behavior highlights an opportunity to explore non-conventional or underemphasized characters that, while not formally incorporated into identification keys, could contain diagnostically significant information.

3.4. Braconidae

The family Braconidae is characterized by several distinct morphological features successfully captured by the YOLOv26 architecture. The most reliable taxonomic distinction for this family is found in the wing venation (Athey et al., 2023). Unlike ichneumonids, braconids almost invariably lack the second recurrent vein (2m-cu) and the areolet in the fore wing (Fig. 13). Furthermore, an essential diagnostic trait is the fusion of metasomal tergites 2 and 3, creating a rigid structural unit that is clearly visible in the lateral profiles of the specimens (Fig. 14). At the subfamily level, specialized mandibular structures serve as additional key identifiers. In specific groups, the mandibles are characteristically open and non-overlapping (Athey et al., 2023; Butcher and Quicke, 2023), as documented in the frontal view provided in Fig. 15.

Figure 12: HiResCAM visualizations for Ichneumonidae reveal that the YOLOv26 architecture identifies and prioritizes alternative morphological structures beyond those traditionally emphasized in dichotomous taxonomic keys.

Figure 13: HiResCAM visualizations for Braconidae reveal that the YOLOv26 architecture identifies and prioritizes the absence of the areolet and vein 2m-cu, aligning with established entomological keys.

Figure 14: HiResCAM visualizations for Braconidae reveal that the YOLOv26 architecture identifies and prioritizes the fused metasomal tergites, aligning with established entomological keys.

Figure 15: HiResCAM visualizations for Braconidae reveal that the YOLOv26 architecture identifies and prioritizes the open mandibles, aligning with established entomological keys.

3.5. Apidae

For Apidae identification, the model's performance was evaluated in recognizing body regions and structures that are traditional diagnostic features crucial for the taxonomic classification of various bee tribes. A key example is the variation in wing venation; the model accurately differentiated between the fully developed, complex venation typical of most apid groups (e.g., Fig. 16) and the significantly reduced or simplified patterns characteristic of stingless bees (e.g., Fig. 17) (Michener, 2007). Additionally, the model captured important head features, particularly the medial margin of the compound eyes and the relative proportions of the different regions (e.g., Fig. 18), which are diagnostic at the genus level. Another key area highlighted by the model was the hind leg, with a focus on specialized structures involved in pollen transport (Fig. 20). The presence of either a scopa or a corbicula (Fig. 19) is a decisive trait for distinguishing tribes of Apidae (Michener, 2007).

4. Conclusion

This study presented a YOLO-based deep learning framework for automated identification of Ichneumonoidea wasps. The proposed system achieved strong classification performance while maintaining computational efficiency suitable for real-time or near real-time applications.

The incorporation of HiResCAM provided critical insight into the model's decision-making process.
Visualization results indicate that the network consistently attends to biologically meaningful morphological features, including wing venation patterns, antennal morphology, and metasomal segmentation. This alignment between learned representations and taxonomic traits enhances model credibility and supports its applicability in scientific workflows.

From a practical standpoint, the framework reduces dependency on expert taxonomists for routine identification tasks and offers scalable support for biodiversity surveys, ecological research, and biological control initiatives. Moreover, the explainability component addresses a major limitation of black-box deep learning systems by enabling qualitative validation of predictions.

Future work may extend this framework to subfamily or genus identification and incorporate larger, more diverse datasets to improve generalization.

Figure 16: HiResCAM visualizations for Apidae reveal that the YOLOv26 architecture identifies and prioritizes the fully developed, complex venation.

Figure 17: HiResCAM visualizations for Apidae reveal that the YOLOv26 architecture identifies and prioritizes the reduced or simplified venation characteristic of stingless bees.

Figure 18: HiResCAM visualizations for Apidae reveal that the YOLOv26 architecture identifies and prioritizes the medial margin of the compound eyes.

Figure 19: HiResCAM visualizations for Apidae reveal that the YOLOv26 architecture identifies and prioritizes the presence of either a scopa or a corbicula.

Figure 20: HiResCAM visualizations for Apidae reveal that the YOLOv26 architecture identifies and prioritizes the presence of either a scopa or a corbicula.
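To make the attribution mechanism concrete: HiResCAM multiplies the gradients of the class score element-wise with the last convolutional activations before summing over channels, so, unlike Grad-CAM's spatially averaged channel weights, each location's contribution to the score is preserved (Draelos and Carin, 2021). The following is a minimal NumPy sketch of that final step, not the authors' released code; the function name `hirescam_map` is ours, and it assumes the activations and their gradients have already been extracted from the network:

```python
import numpy as np

def hirescam_map(activations, gradients):
    """Compute a HiResCAM attribution map.

    activations: last-conv-layer feature maps, shape (C, H, W).
    gradients: d(class score)/d(activations), same shape.
    Gradients are multiplied element-wise with the activations BEFORE
    the channel sum (no spatial averaging of gradients, in contrast to
    Grad-CAM), then rectified and min-max normalized to [0, 1].
    Returns an (H, W) map.
    """
    cam = (gradients * activations).sum(axis=0)  # per-pixel contribution
    cam = np.maximum(cam, 0.0)                   # keep positive evidence only
    rng = cam.max() - cam.min()
    if rng > 0:
        cam = (cam - cam.min()) / rng            # normalize for visualization
    return cam
```

In practice the gradients would come from the framework's autograd (e.g., a backward hook on the final convolutional layer), and the resulting map would be upsampled to the input resolution and overlaid on the specimen image, as in the figures above.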
Integration into mobile or field-deployable systems would further increase accessibility and real-world utility.

Overall, the study demonstrates that combining modern object detection models with explainable AI techniques provides a robust and interpretable solution for automated insect taxonomy.

CRediT authorship contribution statement

João Manoel Herrera Pinheiro: Writing, original draft, methodology, developed the dataset, formal analysis, software. Gabriela do Nascimento Herrera: Writing, original draft, methodology, developed the dataset, formal analysis, reviewed the taxonomy of the insects. Alvaro Doria dos Santos: Provided images of Ichneumonoidea and Vespidae, writing, review and editing. Luciana Bueno dos Reis Fernandes: Provided images of Ichneumonoidea, writing, review and editing. Ricardo V. Godoy: Writing, formal analysis, review and editing. Eduardo A. B. Almeida: Provided images of Apidae, Andrenidae, and Halictidae, writing, review and editing. Helena Carolina Onody: Provided images of Ichneumonoidea and Vespidae, writing, review and editing. Marcelo Andrade da Costa Vieira: Writing, formal analysis, supervision, review and editing. Angélica Maria Penteado-Dias: Provided images of Ichneumonoidea, funding acquisition, data curation, writing, original draft, review, editing, and supervision. Marcelo Becker: Writing, original draft, data curation, funding acquisition, review, editing, and supervision.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Data availability

The dataset is available at Zenodo (Herrera Pinheiro et al., 2026) and the source code used is available in the GitHub repository.
Acknowledgment

This work was supported by Fundação de Apoio à Física e à Química (FAFQ); Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES), grants nº 88887.002221/2024-00 and nº 88887.975224/2024-00; Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP), grants nº 2014/50940-2, 2019/09215-6, and 2022/11451-2; Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq), grant nº 465562/2014-0; and the Instituto Nacional de Ciência e Tecnologia dos Hymenoptera Parasitoides (INCT-HYMPAR).

References

Allison, J.D., Paine, T.D., Slippers, B., Wingfield, M.J. (Eds.), 2023. Forest Entomology and Pathology. 1 ed., Springer Cham. doi:10.1007/978-3-031-11553-0.

Alzubaidi, L., Zhang, J., Humaidi, A.J., Al-Dujaili, A., Duan, Y., Al-Shamma, O., Santamaría, J., Fadhel, M.A., Al-Amidie, M., Farhan, L., 2021. Review of deep learning: concepts, CNN architectures, challenges, applications, future directions. Journal of Big Data 8, 53. doi:10.1186/s40537-021-00444-8.

Athey, K., Fernandez-Triana, J., Penteado-Dias, A., Quicke, D., Sharkey, M., 2023. Key to the New World subfamilies of the family Braconidae (Hymenoptera). Canadian Journal of Arthropod Identification. doi:10.3752/cjai.2023.49.

Barbedo, J.G.A., 2020. Detecting and classifying pests in crops using proximal images and machine learning: A review. AI 1, 312–328. doi:10.3390/ai1020021.

Barbizan Sühs, R., Somavilla, A., Köhler, A., Putzke, J., 2009. Vespídeos (Hymenoptera, Vespidae) vetores de pólen de Schinus terebinthifolius Raddi (Anacardiaceae), Santa Cruz do Sul, RS, Brasil. Brazilian Journal of Biosciences 7, 138–143.

Batz, P., Will, T., Thiel, S., Ziesche, T.M., Joachim, C., 2023. From identification to forecasting: the potential of image recognition and artificial intelligence for aphid pest monitoring. Frontiers in Plant Science 14.
doi:10.3389/fpls.2023.1150748.

Beggs, J.R., Brockerhoff, E.G., Corley, J.C., Kenis, M., Masciocchi, M., Muller, F., Rome, Q., Villemant, C., 2011. Ecological effects and management of invasive alien Vespidae. BioControl 56, 505–526. doi:10.1007/s10526-011-9389-z.

Bhatt, D., Patel, C., Talsania, H., Patel, J., Vaghela, R., Pandya, S., Modi, K., Ghayvat, H., 2021. CNN variants for computer vision: History, architecture, application, challenges and future scope. Electronics 10. doi:10.3390/electronics10202470.

Bilik, S., Zemcik, T., Kratochvila, L., Ricanek, D., Richter, M., Zambanini, S., Horak, K., 2024. Machine learning and computer vision techniques in continuous beehive monitoring applications: A survey. Computers and Electronics in Agriculture 217, 108560. doi:10.1016/j.compag.2023.108560.

Bishop, C., Bishop, H., 2023. Deep Learning: Foundations and Concepts. Springer International Publishing.

Butcher, B., Quicke, D., 2023. The Parasitoid Wasps of South East Asia. CAB International.

Cardoso, P., Barton, P.S., Birkhofer, K., Chichorro, F., Deacon, C., Fartmann, T., Fukushima, C.S., Gaigher, R., Habel, J.C., Hallmann, C.A., Hill, M.J., Hochkirch, A., Kwak, M.L., Mammola, S., Ari Noriega, J., Orfinger, A.B., Pedraza, F., Pryke, J.S., Roque, F.O., Settele, J., Simaika, J.P., Stork, N.E., Suhling, F., Vorster, C., Samways, M.J., 2020. Scientists’ warning to humanity on insect extinctions. Biological Conservation 242, 108426.
doi:10.1016/j.biocon.2020.108426.

Chen, X.x., van Achterberg, C., 2019. Systematics, phylogeny, and evolution of braconid wasps: 30 years of progress. Annual Review of Entomology 64, 335–358. doi:10.1146/annurev-ento-011118-111856.

Christin, S., Hervet, E., Lecomte, N., 2019. Applications for deep learning in ecology. Methods in Ecology and Evolution 10, 1632–1644. doi:10.1111/2041-210X.13256.

De Cesaro Júnior, T., Rieder, R., 2020. Automatic identification of insects from digital images: A survey. Computers and Electronics in Agriculture 178, 105784. doi:10.1016/j.compag.2020.105784.

Draelos, R.L., Carin, L., 2021. Use HiResCAM instead of Grad-CAM for faithful explanations of convolutional neural networks. arXiv:2011.08891.

Eggleton, P., 2020. The state of the world’s insects. Annual Review of Environment and Resources 45, 61–82. doi:10.1146/annurev-environ-012420-050035.

Fenoglio, M.S., Calviño, A., González, E., Salvo, A., Videla, M., 2021. Urbanisation drivers and underlying mechanisms of terrestrial insect diversity loss in cities. Ecological Entomology 46, 757–771. doi:10.1111/een.13041.

Forbes, A.A., Bagley, R.K., Beer, M.A., Hippee, A.C., Widmayer, H.A., 2018.
Quantifying the unquantifiable: why Hymenoptera, not Coleoptera, is the most speciose animal order. BMC Ecology 18, 21. doi:10.1186/s12898-018-0176-x.

Gabriel, D., Tscharntke, T., 2007. Insect pollinated plants benefit from organic farming. Agriculture, Ecosystems & Environment 118, 43–48. doi:10.1016/j.agee.2006.04.005.

Gao, Y., Xue, X., Qin, G., Li, K., Liu, J., Zhang, Y., Li, X., 2024. Application of machine learning in automatic image identification of insects - a review. Ecological Informatics 80, 102539. doi:10.1016/j.ecoinf.2024.102539.

Gauld, I., Bolton, B., 1988. The Hymenoptera. British Museum (Natural History).

Goodfellow, I., Bengio, Y., Courville, A., 2016. Deep Learning. Adaptive Computation and Machine Learning series, MIT Press.

Hanson, P.E., Gauld, I.D., 1995. The Hymenoptera of Costa Rica: The Natural History Museum, London. Oxford University Press. doi:10.1093/oso/9780198549055.001.0001.

Herrera Pinheiro, J.M., do Nascimento Herrera, G., Bueno dos Reis Fernandes, L., Doria dos Santos, A., Vilela de Godoy, R., Andrade Botelho de Almeida, E., Carolina Onody, H., Andrade da Costa Vieira, M., Maria Penteado-Dias, A., Becker, M., 2026. Dataset of parasitoid wasps and associated Hymenoptera (DAPWH). doi:10.5281/zenodo.18501018.
Høye, T.T., Ärje, J., Bjerge, K., Hansen, O.L.P., Iosifidis, A., Leese, F., Mann, H.M.R., Meissner, K., Melvad, C., Raitoharju, J., 2021. Deep learning and computer vision will transform entomology. Proceedings of the National Academy of Sciences 118, e2002545117. doi:10.1073/pnas.2002545117.

Indolia, S., Goswami, A.K., Mishra, S., Asopa, P., 2018. Conceptual understanding of convolutional neural network - a deep learning approach. Procedia Computer Science 132, 679–688. doi:10.1016/j.procs.2018.05.069. International Conference on Computational Intelligence and Data Science.

Jankielsohn, A., 2018. The importance of insects in agricultural ecosystems. Advances in Entomology 06, 62–73. doi:10.4236/ae.2018.62006.

Klopfstein, S., Santos, B.F., Shaw, M.R., Alvarado, M., Bennett, A.M., Dal Pos, D., Giannotta, M., Herrera Florez, A.F., Karlsson, D., Khalaim, A.I., et al., 2019. Darwin wasps: a new name heralds renewed efforts to unravel the evolutionary history of Ichneumonidae. Entomological Communications 1, ec01006. doi:10.37486/2675-1305.ec01006.

LeCun, Y., Bengio, Y., Hinton, G., 2015. Deep learning. Nature 521, 436–444. doi:10.1038/nature14539.

Li, W., Zheng, T., Yang, Z., Li, M., Sun, C., Yang, X., 2021. Classification and detection of insects from field images using deep learning for smart pest management: A systematic review. Ecological Informatics 66, 101460. doi:10.1016/j.ecoinf.2021.101460.

Lin, T.Y., Maire, M., Belongie, S., Bourdev, L., Girshick, R., Hays, J., Perona, P., Ramanan, D., Zitnick, C.L., Dollár, P., 2015.
Microsoft COCO: Common objects in context. arXiv:1405.0312.

Magni, P.A., Harvey, A.D., Guareschi, E.E., 2023. Insects associated with ancient human remains: How archaeoentomology can provide additional information in archaeological studies. Heritage 6, 435–465. doi:10.3390/heritage6010023.

Matthews, R., 1974. Biology of Braconidae. Annual Review of Entomology 19, 15–32. doi:10.1146/annurev.en.19.010174.000311.

May, R.M., 1986. Biological diversity: How many species are there? Nature 324, 514–515. doi:10.1038/324514a0.

Michener, C., 2007. The Bees of the World. Johns Hopkins University Press.

Mora, C., Tittensor, D.P., Adl, S., Simpson, A.G.B., Worm, B., 2011. How many species are there on earth and in the ocean? PLOS Biology 9, 1–8. doi:10.1371/journal.pbio.1001127.

O’Mahony, N., Campbell, S., Carvalho, A., Harapanahalli, S., Hernandez, G.V., Krpalkova, L., Riordan, D., Walsh, J., 2020. Deep learning vs. traditional computer vision, in: Arai, K., Kapoor, S. (Eds.), Advances in Computer Vision, Springer International Publishing, Cham. p. 128–144.

Ong, X.R., Tan, B., Chang, C.H., Puniamoorthy, N., Slade, E.M., 2025. Identifying the knowledge and capacity gaps in Southeast Asian insect conservation. Ecology Letters 28, e70038. doi:10.1111/ele.70038.
Pardo, A., Borges, P.A., 2020. Worldwide importance of insect pollination in apple orchards: A review. Agriculture, Ecosystems & Environment 293, 106839. doi:10.1016/j.agee.2020.106839.

Passias, A., Tsakalos, K.A., Rigogiannis, N., Voglitsis, D., Papanikolaou, N., Michalopoulou, M., Broufas, G., Sirakoulis, G.C., 2024. Insect pest trap development and DL-based pest detection: A comprehensive review. IEEE Transactions on AgriFood Electronics 2, 323–334. doi:10.1109/TAFE.2024.3436470.

Penteado-Dias, A.M., Fernandes, L.B.d.R., 2025. DCBU - Coleção taxonômica do Departamento de Ecologia e Biologia Evolutiva da UFSCar. doi:10.15468/xzkz3y.

Pinheiro, J.M.H., Becker, M., 2026. Deep learning-based computer vision techniques for automated identification of Ichneumonoidea and other Hymenoptera insects. Master’s thesis. Universidade de São Paulo. doi:10.11606/D.18.2026.tde-09022026-143242.

Pinheiro, J.M.H., Herrera, G.D.N., Fernandes, L.B.D.R., Santos, A.D.D., Godoy, R.V., Almeida, E.A.B., Onody, H.C., Vieira, M.A.D.C., Penteado-Dias, A.M., Becker, M., 2026. Descriptor: Dataset of parasitoid wasps and associated Hymenoptera (DAPWH). arXiv:2602.20028.

Quicke, D., 2015. The Braconid and Ichneumonid Parasitoid Wasps: Biology, Systematics, Evolution and Ecology. Wiley.

Raschka, S., 2020. Model evaluation, model selection, and algorithm selection in machine learning. arXiv:1811.12808.

Resh, V., Cardé, R., 2009. Encyclopedia of Insects. Academic Press.

Sapkota, R., Cheppally, R.H., Sharda, A., Karkee, M., 2026. YOLO26: Key architectural enhancements and performance benchmarking for real-time object detection. arXiv:2509.25164.

Sarker, I.H., 2021.
Deep learning: A comprehensive overview on techniques, taxonomy, applications and research directions. SN Computer Science 2, 420. doi:10.1007/s42979-021-00815-1.

Schowalter, T., Noriega, J., Tscharntke, T., 2018. Insect effects on ecosystem services—introduction. Basic and Applied Ecology 26, 1–7. doi:10.1016/j.baae.2017.09.011.

Sharifani, K., Amini, M., 2023. Machine learning and deep learning: A review of methods and applications. World Information Technology and Engineering Journal 10, 3897–3904. URL: https://ssrn.com/abstract=4458723.

Shaw, M., Huddleston, T., 1991. Classification and Biology of Braconid Wasps (Hymenoptera: Braconidae). Handbooks for the Identification of British Insects, Royal Entomological Society of London.

Shinde, P.P., Shah, S., 2018. A review of machine learning and deep learning applications, in: 2018 Fourth International Conference on Computing Communication Control and Automation (ICCUBEA), p. 1–6. doi:10.1109/ICCUBEA.2018.8697857.

Shirali, H., Hübner, J., Both, R., Raupach, M., Reischl, M., Schmidt, S., Pylatiuk, C., 2024. Image-based recognition of parasitoid wasps using advanced neural networks. Invertebrate Systematics 38. doi:10.1071/IS24011.

Slade, E.M., Ong, X.R., 2023. The future of tropical insect diversity: strategies to fill data and knowledge gaps. Current Opinion in Insect Science 58, 101063. doi:10.1016/j.cois.2023.101063.

Stork, N.E., 2018. How many species of insects and other terrestrial arthropods are there on earth? Annual Review of Entomology 63, 31–45.
doi:10.1146/annurev-ento-020117-043348.

Teixeira, A.C., Ribeiro, J., Morais, R., Sousa, J.J., Cunha, A., 2023. A systematic review on automatic insect detection using deep learning. Agriculture 13. doi:10.3390/agriculture13030713.

Tian, Y., Ye, Q., Doermann, D., 2025. YOLOv12: Attention-centric real-time object detectors. arXiv:2502.12524.

Wagner, D.L., Grames, E.M., Forister, M.L., Berenbaum, M.R., Stopak, D., 2021. Insect decline in the Anthropocene: Death by a thousand cuts. Proceedings of the National Academy of Sciences 118, e2023989118. doi:10.1073/pnas.2023989118.

Weinstein, B.G., 2018. A computer vision for animal ecology. Journal of Animal Ecology 87, 533–545. doi:10.1111/1365-2656.12780.

Wipfler, B., Pohl, H., Yavorskaya, M.I., Beutel, R.G., 2016. A review of methods for analysing insect structures — the role of morphology in the age of phylogenomics. Current Opinion in Insect Science 18, 60–68. doi:10.1016/j.cois.2016.09.004.

Wu, X., Zhan, C., Lai, Y.K., Cheng, M.M., Yang, J., 2019. IP102: A large-scale benchmark dataset for insect pest recognition, in: 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), p. 8779–8788. doi:10.1109/CVPR.2019.00899.

Wührl, L., Pylatiuk, C., Giersch, M., Lapp, F., von Rintelen, T., Balke, M., Schmidt, S., Cerretti, P., Meier, R., 2022.
DiversityScanner: Robotic handling of small invertebrates with machine learning methods. Molecular Ecology Resources 22, 1626–1638. doi:10.1111/1755-0998.13567.

Wührl, L., Rettenberger, L., Meier, R., Hartop, E., Graf, J., Pylatiuk, C., 2024. Entomoscope: An open-source photomicroscope for biodiversity discovery. IEEE Access 12, 11785–11794. doi:10.1109/ACCESS.2024.3355272.

Yu, D., van Achterberg, C., Horstmann, K., 2012. World Ichneumonoidea 2011: Taxonomy, Biology, Morphology and Distribution. Taxapad.

Yu, D.S., van Achterberg, C., Horstmann, K., 2016. World Ichneumonoidea 2015: Taxonomy, Biology, Morphology and Distribution. Taxapad.

Zhao, X., Wang, L., Zhang, Y., Han, X., Deveci, M., Parmar, M., 2024. A review of convolutional neural networks in computer vision. Artificial Intelligence Review 57, 99. doi:10.1007/s10462-024-10721-6.

Ärje, J., Raitoharju, J., Iosifidis, A., Tirronen, V., Meissner, K., Gabbouj, M., Kiranyaz, S., Kärkkäinen, S., 2020. Human experts vs. machines in taxa recognition. Signal Processing: Image Communication 87, 115917. doi:10.1016/j.image.2020.115917.