From 036b9390ccebb3e79da15f13796a9028dd31b49c Mon Sep 17 00:00:00 2001
From: amit
Date: Tue, 28 Apr 2026 09:21:40 +0000
Subject: [PATCH 1/4] Add SL-FE phonological framework reference (SignLang 2024)

Adds Sahin & Gokgoz (2024) to the Phonology section, describing the SL-FE
framework that derives continuous phonological feature signals from pose
estimation for automated annotation and quantitative phonological analysis
on TID.

Co-Authored-By: Claude Opus 4.7 (1M context)
---
 src/index.md       |  1 +
 src/references.bib | 19 +++++++++++++++++++
 2 files changed, 20 insertions(+)

diff --git a/src/index.md b/src/index.md
index cca5e2f..c621604 100644
--- a/src/index.md
+++ b/src/index.md
@@ -110,6 +110,7 @@
palm orientation, placement, contact, path movement, local movement, as well as non-manual features including eye aperture, head movement, and torso positioning [@liddell1989american;@johnson2011toward;@brentari2011sign;@sandler2012phonological].
Not all possible phonemes are realized in both signed and spoken languages, and inventories of two languages' phonemes/features may not overlap completely.
Different languages are also subject to rules for the allowed combinations of features.
+@sahin-gokgoz-2024-decoding introduce SL-FE, a framework that derives continuous phonological feature signals (finger selection, orientation, location, and movement) from pose estimation, enabling automated ELAN annotation as well as quantitative tests of theoretical claims such as feature dominance and symmetry on Turkish Sign Language (TID).
###### Simultaneity {-}

Though an ASL sign takes about twice as long to produce than an English word,

diff --git a/src/references.bib b/src/references.bib
index 99a7716..9c27a2f 100644
--- a/src/references.bib
+++ b/src/references.bib
@@ -4788,3 +4788,22 @@ @inproceedings{martinez-guevara-curiel-2024-quantitative
  url = {https://aclanthology.org/2024.signlang-1.25},
  year = {2024}
}
+
+@inproceedings{sahin-gokgoz-2024-decoding,
+    title = "Decoding Sign Languages: The {SL}-{FE} Framework for Phonological Analysis and Automated Annotation",
+    author = {{\c{S}}ahin, Karahan and
+      G{\"o}kg{\"o}z, Kadir},
+    editor = "Efthimiou, Eleni and
+      Fotinea, Stavroula-Evita and
+      Hanke, Thomas and
+      Hochgesang, Julie A. and
+      Mesch, Johanna and
+      Schulder, Marc",
+    booktitle = "Proceedings of the LREC-COLING 2024 11th Workshop on the Representation and Processing of Sign Languages: Evaluation of Sign Language Resources",
+    month = may,
+    year = "2024",
+    address = "Torino, Italia",
+    publisher = "ELRA and ICCL",
+    url = "https://aclanthology.org/2024.signlang-1.22/",
+    pages = "204--212"
+}

From 99372f2010fcc9b10e3276bf878b01b1ed4e26f4 Mon Sep 17 00:00:00 2001
From: AmitMY
Date: Tue, 28 Apr 2026 09:56:03 +0000
Subject: [PATCH 2/4] Trim sahin-gokgoz one-liner (review pattern: concise)

---
 src/index.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/src/index.md b/src/index.md
index c621604..f0daa93 100644
--- a/src/index.md
+++ b/src/index.md
@@ -110,7 +110,7 @@
palm orientation, placement, contact, path movement, local movement, as well as non-manual features including eye aperture, head movement, and torso positioning [@liddell1989american;@johnson2011toward;@brentari2011sign;@sandler2012phonological].
Not all possible phonemes are realized in both signed and spoken languages, and inventories of two languages' phonemes/features may not overlap completely.
Different languages are also subject to rules for the allowed combinations of features.
-@sahin-gokgoz-2024-decoding introduce SL-FE, a framework that derives continuous phonological feature signals (finger selection, orientation, location, and movement) from pose estimation, enabling automated ELAN annotation as well as quantitative tests of theoretical claims such as feature dominance and symmetry on Turkish Sign Language (TID).
+@sahin-gokgoz-2024-decoding introduce SL-FE, a framework that derives continuous phonological feature signals from pose estimation, enabling automated ELAN annotation as well as quantitative tests of theoretical phonological claims on Turkish Sign Language (TID). ###### Simultaneity {-} Though an ASL sign takes about twice as long to produce than an English word, From f7d5bfeb476b568fc598a14cc401a91f027fac94 Mon Sep 17 00:00:00 2001 From: AmitMY Date: Tue, 28 Apr 2026 11:49:49 +0000 Subject: [PATCH 3/4] Move sahin-gokgoz entry to Automating Annotation section (review feedback) --- src/index.md | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/src/index.md b/src/index.md index f0daa93..933b998 100644 --- a/src/index.md +++ b/src/index.md @@ -110,7 +110,6 @@ palm orientation, placement, contact, path movement, local movement, as well as non-manual features including eye aperture, head movement, and torso positioning [@liddell1989american;@johnson2011toward;@brentari2011sign;@sandler2012phonological]. Not all possible phonemes are realized in both signed and spoken languages, and inventories of two languages' phonemes/features may not overlap completely. Different languages are also subject to rules for the allowed combinations of features. -@sahin-gokgoz-2024-decoding introduce SL-FE, a framework that derives continuous phonological feature signals from pose estimation, enabling automated ELAN annotation as well as quantitative tests of theoretical phonological claims on Turkish Sign Language (TID). ###### Simultaneity {-} Though an ASL sign takes about twice as long to produce than an English word, @@ -1220,6 +1219,8 @@ Therefore, data collection often requires significant efforts and costs of on-si One helpful research direction for collecting more data that enables the development of deployable SLP models is creating tools that can simplify or automate parts of the collection and annotation process. 
One of the most significant bottlenecks in obtaining more adequate signed language data is the time and scarcity of experts required to perform annotation. Therefore, tools that perform automatic parsing, detection of frame boundaries, extraction of articulatory features, suggestions for lexical annotations, and allow parts of the annotation process to be crowdsourced to non-experts, to name a few, have a high potential to facilitate and accelerate the availability of good data. Targeting prosodic non-manual annotation specifically, @susman-kimmelman-2024-eye trained a CNN classifier of eye openness (open, in-between, closed) on French Sign Language data and combined it with rule-based temporal aggregation to detect linguistically defined eye blinks, outperforming an Eye Aspect Ratio (EAR) baseline computed from MediaPipe landmarks. +@sahin-gokgoz-2024-decoding introduce SL-FE, a framework that derives continuous phonological feature signals from pose estimation, enabling automated ELAN annotation as well as quantitative tests of theoretical phonological claims on Turkish Sign Language (TID). + ### Practice Deaf Collaboration Finally, when working with signed languages, it is vital to keep in mind \emph{who} this technology should benefit and \emph{what} they need. 
From 0e9a70ee2a8d41370527b15737e1dc9d070437d4 Mon Sep 17 00:00:00 2001 From: AmitMY Date: Tue, 28 Apr 2026 12:41:33 +0000 Subject: [PATCH 4/4] Move sahin-gokgoz entry to Linguistic Analysis section (review feedback) --- src/index.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/src/index.md b/src/index.md index 933b998..2b9e3fb 100644 --- a/src/index.md +++ b/src/index.md @@ -1093,6 +1093,8 @@ This sub-area covers experimental studies of sign language structure that use co @martinez-guevara-curiel-2024-quantitative analyse hand locations across BSL, NGT, and LSM and find that signers organize the signing space into a Zipfian spatial hierarchy, concentrating articulation in cohesive regions more systematically than non-linguistic gesturers. +@sahin-gokgoz-2024-decoding introduce SL-FE, a framework that derives continuous phonological feature signals from pose estimation, enabling automated ELAN annotation as well as quantitative tests of theoretical phonological claims on Turkish Sign Language (TID). + ## Annotation Tools ##### ELAN - EUDICO Linguistic Annotator @@ -1219,8 +1221,6 @@ Therefore, data collection often requires significant efforts and costs of on-si One helpful research direction for collecting more data that enables the development of deployable SLP models is creating tools that can simplify or automate parts of the collection and annotation process. One of the most significant bottlenecks in obtaining more adequate signed language data is the time and scarcity of experts required to perform annotation. Therefore, tools that perform automatic parsing, detection of frame boundaries, extraction of articulatory features, suggestions for lexical annotations, and allow parts of the annotation process to be crowdsourced to non-experts, to name a few, have a high potential to facilitate and accelerate the availability of good data. 
Targeting prosodic non-manual annotation specifically, @susman-kimmelman-2024-eye trained a CNN classifier of eye openness (open, in-between, closed) on French Sign Language data and combined it with rule-based temporal aggregation to detect linguistically defined eye blinks, outperforming an Eye Aspect Ratio (EAR) baseline computed from MediaPipe landmarks. -@sahin-gokgoz-2024-decoding introduce SL-FE, a framework that derives continuous phonological feature signals from pose estimation, enabling automated ELAN annotation as well as quantitative tests of theoretical phonological claims on Turkish Sign Language (TID). - ### Practice Deaf Collaboration Finally, when working with signed languages, it is vital to keep in mind \emph{who} this technology should benefit and \emph{what} they need.