{"id":190494,"date":"2024-06-02T12:23:16","date_gmt":"2024-06-02T17:23:16","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2024\/06\/a-3d-ray-traced-biological-neural-network-learning-model"},"modified":"2024-06-02T12:23:16","modified_gmt":"2024-06-02T17:23:16","slug":"a-3d-ray-traced-biological-neural-network-learning-model","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2024\/06\/a-3d-ray-traced-biological-neural-network-learning-model","title":{"rendered":"A 3D ray traced biological neural network learning model"},"content":{"rendered":"<p><a class=\"aligncenter blog-photo\" href=\"https:\/\/lifeboat.com\/blog.images\/a-3d-ray-traced-biological-neural-network-learning-model.jpg\"><\/a><\/p>\n<p>In artificial neural networks, many models are trained for a narrow task using a specific dataset. They face difficulties in solving problems that include dynamic input\/output data types and changing objective functions. Whenever the input\/output tensor dimension or the data type is modified, the machine learning models need to be rebuilt and subsequently retrained from scratch. Furthermore, many machine learning algorithms that are trained for a specific objective, such as classification, may perform poorly at other tasks, such as reinforcement learning or quantification.<\/p>\n<p>Even if the input\/output dimensions and the objective functions remain constant, the algorithms do not generalize well across different datasets. For example, a neural network trained on classifying cats and dogs does not perform well on classifying humans and horses despite both of the datasets having the exact same image input<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 1\" title=\"Jakubovitz, D., Giryes, R., Rodrigues, M. R. D. Generalization error in deep learning. 
In Compressed Sensing and Its Applications, Third International MATHEON Conference 2017 153&ndash;195 (Birkh\u00e4user, 2019).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-48747-7#ref-CR1\" id=\"ref-link-section-d187483241e397\">1<\/a><\/sup>. Moreover, neural networks are highly susceptible to adversarial attacks<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 2\" title=\"Chen, S.-T., Cornelius, C., Martin, J. & Chau, D. H. P. Shapeshifter: robust physical adversarial attack on faster R-CNN object detector. In Machine Learning and Knowledge Discovery in Databases 52&ndash;68 (Springer, 2018).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-48747-7#ref-CR2\" id=\"ref-link-section-d187483241e401\">2<\/a><\/sup>. A small deviation from the training dataset, such as changing one pixel, could cause the neural network to have significantly worse performance. This problem is known as the generalization problem<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 3\" title=\"Jiang, Y. et al. Methods and analysis of the first competition in predicting generalization of deep learning. In Proc. NeurIPS 2020 Competition and Demonstration Track 170&ndash;190 (PMLR, 2021).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-48747-7#ref-CR3\" id=\"ref-link-section-d187483241e405\">3<\/a><\/sup>, and the field of transfer learning can help to solve it.<\/p>\n<p>Transfer learning<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" title=\"Tan, C. et al. A survey on deep transfer learning. 
In 27th International Conference on Artificial Neural Networks and Machine Learning 270&ndash;279 (2018).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-48747-7#ref-CR4\" id=\"ref-link-section-d187483241e412\">4<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" title=\"Zhuang, F. et al. A comprehensive survey on transfer learning. Proc. IEEE 109, 43&ndash;76 (2020).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-48747-7#ref-CR5\" id=\"ref-link-section-d187483241e412_1\">5<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" title=\"Agarwal, N., Sondhi, A., Chopra, K. & Singh, G. Transfer learning: survey and classification. In Smart Innovations in Communication and Computational Sciences: Proceedings of ICSICCS 2020 (eds Tiwari, S., Trivedi, M., Mishra, K., Misra, A., Kumar, K. & Suryani, E.) 1168, 145&ndash;155 (Springer, 2021).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-48747-7#ref-CR6\" id=\"ref-link-section-d187483241e412_2\">6<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" title=\"Shao, L., Zhu, F. & Li, X. Transfer learning for visual categorization: a survey. IEEE Trans. Neural Netw. Learn. Syst. 26, 1019&ndash;1034 (2014).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-48747-7#ref-CR7\" id=\"ref-link-section-d187483241e412_3\">7<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" title=\"Liang, H., Fu, W., Yi, F. A survey of recent advances in transfer learning. 
In IEEE 19th International Conference on Communication Technology 1516&ndash;1523 (IEEE, 2019).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-48747-7#ref-CR8\" id=\"ref-link-section-d187483241e412_4\">8<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" title=\"Niu, S., Liu, Y., Wang, J. & Song, H. A decade survey of transfer learning (2010&ndash;2020). IEEE Trans. Artificial Intell. 1, 151&ndash;166 (2020).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-48747-7#ref-CR9\" id=\"ref-link-section-d187483241e412_5\">9<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 10\" title=\"Nguyen, C. T. et al. Transfer learning for wireless networks: a comprehensive survey. Proc. IEEE 110, 1073&ndash;1115 (2022).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-48747-7#ref-CR10\" id=\"ref-link-section-d187483241e415\">10<\/a><\/sup> solves the problems presented above by allowing knowledge transfer from one neural network to another. A common way to use supervised transfer learning is to take a large pre-trained neural network and retrain it for a different but closely related problem. This significantly reduces training time and allows the model to be trained on a less powerful computer. Many researchers have used pre-trained neural networks such as ResNet-50<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 11\" title=\"Wu, Z., Shen, C. & Hengel, A. V. D. Wider or deeper: revisiting the ResNet model for visual recognition. 
Pattern Recognition 90, 119&ndash;133 (2019).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-48747-7#ref-CR11\" id=\"ref-link-section-d187483241e419\">11<\/a><\/sup> and retrained them to classify malicious software<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" title=\"Rezende, E., Ruppert, G., Carvalho, T., Ramos, F., Geus, P. D. Malicious software classification using transfer learning of ResNet-50 deep neural network. In 16th IEEE International Conference on Machine Learning and Applications 1011&ndash;1014 (IEEE, 2017).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-48747-7#ref-CR12\" id=\"ref-link-section-d187483241e423\">12<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" title=\"Jiao, W., Wang, Q., Cheng, Y. & Zhang, Y. End-to-end prediction of weld penetration: a deep learning and transfer learning based method. J. Manuf. Process. 63, 191&ndash;197 (2021).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-48747-7#ref-CR13\" id=\"ref-link-section-d187483241e423_1\">13<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" title=\"Du, H., He, Y. & Jin, T. Transfer learning for human activities classification using micro-Doppler spectrograms. In IEEE International Conference on Computational Electromagnetics 1&ndash;3 (IEEE, 2018).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-48747-7#ref-CR14\" id=\"ref-link-section-d187483241e423_2\">14<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 15\" title=\"Rismiyati, Endah, S. N., Khadijah, Shiddiq, I. N. Xception architecture transfer learning for garbage classification. 
In 4th IEEE International Conference on Informatics and Computational Sciences 1&ndash;4 (IEEE, 2020).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-48747-7#ref-CR15\" id=\"ref-link-section-d187483241e426\">15<\/a><\/sup>. Another application of transfer learning is tackling the generalization problem, where the testing dataset is completely different from the training dataset. For example, every human has unique electroencephalography (EEG) signals because each person's brain structure is distinct. Transfer learning addresses this by pretraining on a general-population EEG dataset and then retraining the model for a specific patient<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" title=\"Zhang, R. et al. Hybrid deep neural network using transfer learning for EEG motor imagery decoding. Biomed. Signal Process. Control 63, 102144&ndash;102151 (2021).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-48747-7#ref-CR16\" id=\"ref-link-section-d187483241e430\">16<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" title=\"Wan, Z., Yang, R., Huang, M., Zeng, N. & Liu, X. A review on transfer learning in EEG signal analysis. Neurocomputing 421, 1&ndash;14 (2021).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-48747-7#ref-CR17\" id=\"ref-link-section-d187483241e430_1\">17<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" title=\"Zheng, W.-L. & Lu, B.-L. Personalizing EEG-based affective models with transfer learning. In Proc. 
Twenty-Fifth International Joint Conference on Artificial Intelligence 2732&ndash;2738 (AAAI, 2016).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-48747-7#ref-CR18\" id=\"ref-link-section-d187483241e430_2\">18<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" title=\"Salem, M., Taheri, S. & Shiun-Yuan, J. ECG arrhythmia classification using transfer learning from 2-dimensional deep CNN features. In IEEE Biomedical Circuits and Systems Conference 1&ndash;4 (IEEE, 2018).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-48747-7#ref-CR19\" id=\"ref-link-section-d187483241e430_3\">19<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 20\" title=\"Van Steenkiste, G., Loon, G. & Crevecoeur, G. Transfer learning in ECG classification from human to horse using a novel parallel neural network architecture. Sci. Rep. 10, 1&ndash;12 (2020).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-48747-7#ref-CR20\" id=\"ref-link-section-d187483241e433\">20<\/a><\/sup>. As a result, the neural network is dynamically tailored to a specific person and can properly interpret that person's EEG signals. Labeling large datasets by hand is tedious and time-consuming. In semi-supervised transfer learning<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" title=\"Wang, Y. et al. Transfer learning for semi-supervised automatic modulation classification in ZF-MIMO systems. IEEE J. Emerg. Select. Top. Circuits Syst. 10, 231&ndash;239 (2020).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-48747-7#ref-CR21\" id=\"ref-link-section-d187483241e437\">21<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" title=\"Cheplygina, V., Bruijne, M. & Pluim, J. P. 
Not-so-supervised: a survey of semi-supervised, multi-instance, and transfer learning in medical image analysis. Med. Image Anal. 54, 280&ndash;296 (2019).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-48747-7#ref-CR22\" id=\"ref-link-section-d187483241e437_1\">22<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" title=\"Wei, W., Meng, D., Zhao, Q., Xu, Z. & Wu, Y. Semi-supervised transfer learning for image rain removal. In Proc. IEEE\/CVF Conference on Computer Vision and Pattern Recognition 3877&ndash;3886 (IEEE, 2019).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-48747-7#ref-CR23\" id=\"ref-link-section-d187483241e437_2\">23<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 24\" title=\"Al Ghamdi, M., Li, M., Abdel-Mottaleb, M. & Abou Shousha, M. Semi-supervised transfer learning for convolutional neural networks for glaucoma detection. In 44th IEEE International Conference on Acoustics, Speech and Signal Processing 3812&ndash;3816 (IEEE, 2019).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-48747-7#ref-CR24\" id=\"ref-link-section-d187483241e440\">24<\/a><\/sup>, either the source dataset or the target dataset is unlabeled. This allows the neural network to learn on its own which pieces of information to extract and process, without requiring many labels.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>In artificial neural networks, many models are trained for a narrow task using a specific dataset. They face difficulties in solving problems that include dynamic input\/output data types and changing objective functions. 
Whenever the input\/output tensor dimension or the data type is modified, the machine learning models need to be rebuilt and subsequently retrained from [\u2026]<\/p>\n","protected":false},"author":661,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[3,41,6],"tags":[],"class_list":["post-190494","post","type-post","status-publish","format-standard","hentry","category-biological","category-information-science","category-robotics-ai"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/190494","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/661"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=190494"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/190494\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=190494"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=190494"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=190494"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}