
dc.contributor.author    Ζαχαρής, Γεώργιος
dc.date.accessioned    2024-05-27T09:05:00Z
dc.date.available    2024-05-27T09:05:00Z
dc.date.issued    2024-02
dc.identifier.uri    https://dspace.uowm.gr/xmlui/handle/123456789/4772
dc.description.abstract    The flourishing of Artificial Intelligence (AI) in recent years, driven largely by technological advances that produced high-performing hardware, has led to the research and development of many real-world applications. Now that the field's theoretical foundations are being applied to real-world problems, Machine Learning (ML), a subcategory of AI, has proven highly advantageous for computer vision tasks, including image classification. This has led to the development of various high-performing image classification Neural Networks (NNs), each with a different architectural approach. Through Transfer Learning (TL), these networks can be used to develop real-world applications. However, such applications come with challenges that require an NN performing a task to be highly efficient, accurate, fast, stable, well generalized, and as light on computational power as possible. There is constant research into improving models by designing innovative architectures with various tools and techniques, including activation functions. This work focuses on improving popular, high-performing pre-trained image classification NNs by altering the activation functions used at their core. The models are trained on five datasets, each time with a different activation function applied throughout their entire architecture. Nine activation functions were chosen for testing. The experiments show encouraging results, as improvements in accuracy or training time are possible, in many cases to a considerable extent.    en_US
dc.description.sponsorship    Supervising Professor: Πλόσκας Νικόλαος    en_US
dc.language.iso    en    en_US
dc.publisher    Ζαχαρής, Γεώργιος    en_US
dc.subject    Machine Learning    en_US
dc.subject    Neural Networks    en_US
dc.subject    Activation Functions    en_US
dc.subject    Deep Learning    en_US
dc.subject    TensorFlow    en_US
dc.subject    Keras    en_US
dc.title    Μελέτη και σύγκριση συναρτήσεων ενεργοποίησης σε νευρωνικά δίκτυα    en_US
dc.title.alternative    Study and comparison of activation functions for neural networks    en_US
dc.type    Thesis    en_US
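
The abstract above describes replacing the activation functions throughout popular pre-trained image classification networks and then training them on several datasets. Below is a minimal, illustrative sketch of how such a swap could be carried out with TensorFlow/Keras (the frameworks listed in the subject terms). The choice of ResNet50, of ReLU as the activation being replaced, and of swish as the substitute are assumptions made purely for illustration; the record does not specify which models or activation functions the thesis uses.

    # Minimal sketch: swap the activation functions of a pre-trained Keras
    # image classifier before fine-tuning. ResNet50 and swish are only
    # illustrative choices, not details taken from the thesis.
    import tensorflow as tf

    def swap_activations(model, new_activation):
        """Replace ReLU activations with `new_activation` across the model.

        Covers layers built with an `activation` argument (Conv2D, Dense)
        as well as standalone Activation layers, both of which expose an
        `activation` attribute that is applied when the layer is called.
        """
        for layer in model.layers:
            current = getattr(layer, "activation", None)
            if current is not None and getattr(current, "__name__", "") == "relu":
                layer.activation = new_activation
        return model

    # weights=None keeps the example self-contained (no weight download);
    # in practice one would load ImageNet weights and fine-tune afterwards.
    model = tf.keras.applications.ResNet50(weights=None, include_top=True)
    model = swap_activations(model, tf.keras.activations.swish)
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])

Depending on the Keras version and workflow, a rebuild step (for example via tf.keras.models.clone_model or a save/reload round trip) may be needed for the new activation to be reflected in serialized configurations. The experiments described in the abstract, covering five datasets and nine activation functions, would then wrap a swap of this kind in a training loop.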

