{"id":12377,"date":"2024-12-16T18:08:35","date_gmt":"2024-12-16T18:08:35","guid":{"rendered":"https:\/\/www.tun.com\/home\/?p=12377"},"modified":"2024-12-16T20:16:26","modified_gmt":"2024-12-16T20:16:26","slug":"fau-researchers-develop-ai-system-for-real-time-american-sign-language-interpretation","status":"publish","type":"post","link":"https:\/\/www.tun.com\/home\/fau-researchers-develop-ai-system-for-real-time-american-sign-language-interpretation\/","title":{"rendered":"FAU Researchers Develop AI System for Real-Time American Sign Language Interpretation"},"content":{"rendered":"\n<div class=\"wp-block-group\"><div class=\"wp-block-group__inner-container is-layout-constrained wp-block-group-is-layout-constrained\">\n<div class=\"wp-block-uagb-blockquote uagb-block-e7eb3fc3 uagb-blockquote__skin-border uagb-blockquote__stack-img-none\"><blockquote class=\"uagb-blockquote\"><div class=\"uagb-blockquote__content\">FAU&#8217;s groundbreaking study leverages AI to interpret American Sign Language in real-time, ensuring more inclusive communication for the deaf and hard-of-hearing. 
The study achieved remarkable accuracy rates, showcasing the potential for practical, real-time applications.<\/div><footer><div class=\"uagb-blockquote__author-wrap uagb-blockquote__author-at-left\"><\/div><\/footer><\/blockquote><\/div>\n\n\n\n<div class=\"wp-block-group is-content-justification-space-between is-nowrap is-layout-flex wp-container-core-group-is-layout-0dfbf163 wp-block-group-is-layout-flex\"><div style=\"font-size:16px;\" class=\"has-text-align-left wp-block-post-author\"><div class=\"wp-block-post-author__content\"><p class=\"wp-block-post-author__name\">The University Network<\/p><\/div><\/div>\n<\/div>\n<\/div><\/div>\n\n\n\n<p>Researchers at Florida Atlantic University (FAU) have achieved a significant breakthrough in enhancing communication accessibility for individuals who are deaf or hard-of-hearing. By leveraging the power of artificial intelligence, the research team has developed a pioneering system capable of interpreting American Sign Language (ASL) gestures in real-time.<\/p>\n\n\n\n<p>Bader Alsharif, the first author of the study and a doctoral candidate in FAU&#8217;s Department of Electrical Engineering and Computer Science, emphasized the innovation behind their approach. <\/p>\n\n\n\n<p>\u201cCombining MediaPipe and YOLOv8, along with fine-tuning hyperparameters for the best accuracy, represents a groundbreaking and innovative approach,\u201d he said in a <a href=\"https:\/\/www.fau.edu\/newsdesk\/articles\/artificial-intelligence-sign-language-study.php\" target=\"_blank\" rel=\"noopener\" title=\"\">news release<\/a>. 
&#8220;This method hasn\u2019t been explored in previous research, making it a new and promising direction for future advancements.\u201d<\/p>\n\n\n\n<p>The study, <a href=\"https:\/\/www.sciencedirect.com\/science\/article\/pii\/S2773186324000951\" target=\"_blank\" rel=\"noopener\" title=\"\">published<\/a> in the Elsevier journal Franklin Open, involved the creation of a custom dataset of 29,820 static images of ASL hand gestures. Each image was meticulously annotated with 21 key landmarks on the hand using MediaPipe, providing detailed spatial information about the hand\u2019s structure and position. The integration of these annotations significantly enhanced the precision of YOLOv8, the deep learning model employed in the research.<\/p>\n\n\n\n<p>By leveraging this detailed hand pose information, the system achieved remarkable results. The model performed with an accuracy of 98%, correctly identified gestures (recall) at a rate of 98%, and attained an overall performance score (F1 score) of 99%. It also recorded a mean Average Precision (mAP) of 98% and a more detailed mAP50-95 score of 93%, highlighting its high reliability and precision.<\/p>\n\n\n\n<p>\u201cOur research demonstrates the potential of combining advanced object detection algorithms with landmark tracking for real-time gesture recognition, offering a reliable solution for American Sign Language interpretation,\u201d added co-author Mohammad Ilyas, a professor in FAU\u2019s Department of Electrical Engineering and Computer Science. &#8220;The success of this model is largely due to the careful integration of transfer learning, meticulous dataset creation and precise tuning of hyperparameters.&#8221;<\/p>\n\n\n\n<p>This innovative system represents a crucial leap forward in assistive technology, as it can significantly reduce communication barriers. 
The model&#8217;s ability to maintain high recognition rates even under varying hand positions underscores its adaptability in diverse settings.<\/p>\n\n\n\n<p>Looking to the future, the research team aims to expand the dataset to include a wider range of hand shapes and gestures, further enhancing the model&#8217;s accuracy. Additionally, optimizing the model for deployment on edge devices will ensure its real-time performance in resource-constrained environments, making it accessible for everyday use.<\/p>\n\n\n\n<p>Stella Batalama, dean of FAU\u2019s College of Engineering and Computer Science, highlighted the societal impact of this research. <\/p>\n\n\n\n<p>&#8220;By improving American Sign Language recognition, this work contributes to creating tools that can enhance communication for the deaf and hard-of-hearing community,&#8221; she said in the news release. &#8220;The model\u2019s ability to reliably interpret gestures opens the door to more inclusive solutions that support accessibility, making daily interactions &#8212; whether in education, healthcare, or social settings &#8212; more seamless and effective for individuals who rely on sign language. This progress holds great promise for fostering a more inclusive society where communication barriers are reduced.\u201d<\/p>\n\n\n\n<p>The study also involved contributions from Easa Alalwany, a recent doctoral graduate of FAU&#8217;s College of Engineering and Computer Science and an assistant professor at Taibah University in Saudi Arabia.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Researchers at Florida Atlantic University (FAU) have achieved a significant breakthrough in enhancing communication accessibility for individuals who are deaf or hard-of-hearing. By leveraging the power of artificial intelligence, the research team has developed a pioneering system capable of interpreting American Sign Language (ASL) gestures in real-time. 
Bader Alsharif, the first author of the study [&hellip;]<\/p>\n","protected":false},"author":3,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"single-no-separators","format":"standard","meta":{"_acf_changed":false,"_uag_custom_page_level_css":"","_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"categories":[8],"tags":[],"class_list":["post-12377","post","type-post","status-publish","format-standard","hentry","category-ai"],"acf":[],"aioseo_notices":[],"uagb_featured_image_src":{"full":false,"thumbnail":false,"medium":false,"medium_large":false,"large":false,"1536x1536":false,"2048x2048":false},"uagb_author_info":{"display_name":"The University Network","author_link":"https:\/\/www.tun.com\/home\/author\/funky_junkie\/"},"uagb_comment_info":0,"uagb_excerpt":"Researchers at Florida Atlantic University (FAU) have achieved a significant breakthrough in enhancing communication accessibility for individuals who are deaf or hard-of-hearing. By leveraging the power of artificial intelligence, the research team has developed a pioneering system capable of interpreting American Sign Language (ASL) gestures in real-time. 
Bader Alsharif, the first author of the study&hellip;","_links":{"self":[{"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/posts\/12377","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/comments?post=12377"}],"version-history":[{"count":9,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/posts\/12377\/revisions"}],"predecessor-version":[{"id":12421,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/posts\/12377\/revisions\/12421"}],"wp:attachment":[{"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/media?parent=12377"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/categories?post=12377"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/tags?post=12377"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}
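The article describes a pipeline in which images annotated with MediaPipe's 21 hand landmarks are used to train a YOLOv8 detector. As a rough illustration of one step in that kind of pipeline, the sketch below turns normalized landmark coordinates into a YOLO-format bounding-box label. It is hypothetical, not the authors' code: the function name, the `margin` parameter, and the example coordinates are invented for demonstration, and only two landmarks are used so the arithmetic is easy to follow.

```python
# Hypothetical sketch: deriving a YOLO-format label ("class x_c y_c w h",
# all normalized to [0, 1]) from hand landmarks of the kind MediaPipe
# produces. Not the study's actual code; margin and names are assumptions.

def landmarks_to_yolo_label(landmarks, class_id, margin=0.05):
    """Compute a normalized YOLO label from (x, y) landmark coordinates.

    The bounding box tightly encloses the landmarks, padded by `margin`
    on each side and clamped to the [0, 1] image frame.
    """
    xs = [x for x, _ in landmarks]
    ys = [y for _, y in landmarks]
    x_min = max(min(xs) - margin, 0.0)
    x_max = min(max(xs) + margin, 1.0)
    y_min = max(min(ys) - margin, 0.0)
    y_max = min(max(ys) + margin, 1.0)
    x_c = (x_min + x_max) / 2          # box center, normalized
    y_c = (y_min + y_max) / 2
    w = x_max - x_min                  # box size, normalized
    h = y_max - y_min
    return f"{class_id} {x_c:.6f} {y_c:.6f} {w:.6f} {h:.6f}"

# Two corner landmarks are enough to see the arithmetic:
print(landmarks_to_yolo_label([(0.30, 0.40), (0.60, 0.80)], class_id=0))
# prints "0 0.450000 0.600000 0.400000 0.500000"
```

In a real dataset-building script, the landmark list would come from a hand-tracking pass over each of the images, and the resulting label lines would be written to the per-image text files a YOLO trainer expects.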