American Sign Language Linguistic Research Project Reports

Below you will find abstracts of project reports and information about obtaining copies. The reports are available for download in portable document format (PDF).

Report No. 26   Carol Neidle  (2024 - forthcoming) - [pdf coming soon]

What's New in SignStream® 3.5?

NEW:  Report No. 25   Carol Neidle and Augustine Opoku (2024) - [pdf - 7.5 MB]   DOI: 10.13140/RG.2.2.25217.06247

A Guide to the ASLLRP Sign Bank – New Search Features  

Report No. 24   Carol Neidle and Augustine Opoku (2023) - [pdf]

Documentation for Download of ASLLRP Signs Segmented from Continuous Signing Corpora

Report No. 23   Carol Neidle (2023) - [pdf]

What's New in SignStream® 3.4.1?

Report No. 22   Carol Neidle (2022) [pdf - 1.6 MB]

What's New in SignStream® 3.4.0?

BU Open Access: https://open.bu.edu/handle/2144/45443

Report No. 21    Neidle, C. and C. Ballard (2022) [pdf]

Why Alternative Gloss Labels Will Increase the Value of the WLASL Dataset

See also: C. Neidle and C. Ballard, "Revised Gloss Labels for Signs from the WLASL Database: Preliminary Version." [pdf]

Report No. 20    Neidle, C. and A. Opoku (2022) [pdf]

Documentation for Download of ASLLRP Sign Bank Citation-Form Sign Datasets

October 2023:  Addendum 

Report No. 19    Neidle, C. and A. Opoku (2021) [pdf - 6.2 MB]

Update on Linguistically Annotated ASL Video Data Available through the American Sign Language Linguistic Research Project (ASLLRP)

BU Open Access: https://open.bu.edu/handle/2144/45756

Report No. 18    Neidle, C. and A. Opoku (2020) [pdf - 11 MB]

A User's Guide to the American Sign Language Linguistic Research Project (ASLLRP) Data Access Interface (DAI) 2 — Version 2

with an addendum about newly available data as of March 2022, available from:
http://www.bu.edu/asllrp/about-datasets.pdf

For information about new features as of May 2021, including ASLLRP Sign Bank Search by Related English Words and Display Subsets of Signs by Sign Type, see: http://www.bu.edu/asllrp/New-features-DAI2.pdf

BU Open Access: https://open.bu.edu/handle/2144/45444

Report No. 17    Neidle, C. (2020) [pdf - 14 MB]

What's New in SignStream® 3.3.0?

BU Open Access: https://open.bu.edu/handle/2144/45757

Report No. 16    Neidle, C. (May 2018) [pdf - 14 MB]

What's New in SignStream® 3.1.0?

Report No. 15    Neidle, C. (August 2017) [pdf - 24 MB]

A User's Guide to SignStream® 3

BU Open Access: https://open.bu.edu/handle/2144/45758

Report No. 14    Duffy, Q. (August 2007)

The ASL Perfect Formed by Preverbal FINISH [pdf - 2.1 MB]

Copy of manuscript submitted in partial fulfillment of requirements for M.A. degree in Applied Linguistics at Boston University

Report No. 13    Neidle, C. (August 2007)

SignStream Annotation: Addendum to Conventions used for the American Sign Language Linguistic Research Project [pdf - 3 MB]

This report supplements Report No. 11. Together, the two reports document ASLLRP Annotation Schema 3.0, which is the basis for the data released in 2007.

Report No. 12    Neidle, C. and R. G. Lee (July 2005)

The Syntactic Organization of American Sign Language: A Synopsis [pdf - 1 MB]

This report is a very concise overview of some of our recent findings concerning the syntactic organization of ASL.

Report No. 11    Neidle, C. (August 2002).

SignStream™ Annotation: Conventions used for the American Sign Language Linguistic Research Project [pdf - 8.6 MB]

This report is intended to assist in interpreting the annotations contained in the coded data we distribute. It is also intended to assist those who wish to adopt these conventions.

The notations that we have used are explained and, in many cases, illustrated with pictures. We discuss the considerations that led us to make particular decisions about annotation conventions.

BU Open Access: https://open.bu.edu/handle/2144/45760

Report No. 10    Neidle, C. (2000).

SignStream™: A Database Tool for Research on Visual-Gestural Language [pdf]

This document provides a brief summary of the features of SignStream version 2.0. The current status of the project and plans for future development are also addressed.

BU Open Access: https://open.bu.edu/handle/2144/45761

Report No. 9    MacLaughlin, D., C. Neidle, and D. Greenfield (2000).

SignStream™ User's Guide, Version 2.0 [pdf - 2.5 MB]

The first part of the guide functions as a tutorial, interactively demonstrating the main capabilities of SignStream, using the sample database that is provided with the application. The second part of the guide provides details about various aspects of the program. The guide concludes with several appendices containing reference information.

Report No. 8    MacLaughlin, D., C. Neidle, and D. Greenfield (1999).

SignStream™ User's Guide, Version 1.5 [pdf - 2.3 MB]

Please note that Report 9 contains the current SignStream™ User's Guide.

Report No. 6    Neidle, C., D. MacLaughlin, R.G. Lee, B. Bahan, and J. Kegl (1998).

Wh-Questions in ASL: A Case for Rightward Movement.
[report 6 download page (with links to pdf and QuickTime files)]

This paper presents an analysis of wh-movement in American Sign Language in which moved wh-phrases occur in a rightward specifier of CP position. Evidence is based on straightforward word order facts and on the distribution of non-manual wh-marking, which displays the same patterns and systematicity as other non-manual syntactic markings. We address recent criticisms by Petronio and Lillo-Martin (1997). We show that their alternative interpretations of the data are incorrect, and that their analysis cannot account for the facts of the language. Thus, we maintain that universal grammar must allow the option of rightward movement.

Report No. 5    Neidle, C., D. MacLaughlin, B. Bahan, R.G. Lee, and J. Kegl (1997).

The SignStream™ Project [pdf]

SignStream is a database tool for analysis of linguistic data captured on video, currently under development by researchers at Boston University, Dartmouth College, Gallaudet University, and Rutgers University. The program is being designed specifically to assist in linguistic research on American Sign Language; however, it will provide sufficient flexibility for use in a wide variety of other applications. SignStream provides a single computing environment for manipulating video and linking specific frame sequences to simultaneously occurring linguistic events encoded in a fine-grained, multi-level transcription. Not only does SignStream greatly enhance the transcription process, but it also enables the researcher to perform linguistic analyses of various kinds. By providing sophisticated search capabilities, SignStream will afford instant access to data.
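To give a rough sense of what linking frame sequences to simultaneously occurring linguistic events in a multi-level transcription can look like, here is a minimal sketch in Python. It is only an illustration of the general idea: the Event class, its field names, and the overlapping() helper are hypothetical and do not reflect SignStream's actual data model, file format, or search interface.

# Illustrative sketch only: a toy model of multi-tier, frame-aligned annotation.
# The names below are hypothetical; they are not SignStream's actual data model.
from dataclasses import dataclass

@dataclass
class Event:
    tier: str          # annotation tier, e.g. "gloss" or "non-manual"
    label: str         # the annotation itself, e.g. "WHO" or "wh-marking"
    start_frame: int   # first video frame of the event
    end_frame: int     # last video frame of the event

def overlapping(events, tier, start, end):
    """Return events on the given tier whose frame spans overlap [start, end]."""
    return [e for e in events
            if e.tier == tier and e.start_frame <= end and e.end_frame >= start]

# Example: find the non-manual markings that co-occur with the gloss WHO.
events = [
    Event("gloss", "WHO", 120, 135),
    Event("non-manual", "wh-marking (furrowed brows)", 118, 140),
    Event("gloss", "ARRIVE", 136, 150),
]
who = events[0]
print(overlapping(events, "non-manual", who.start_frame, who.end_frame))

In a full tool such as SignStream, events on each tier are additionally tied back to the underlying video, so that the matching frame sequences can be located and played directly.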

Report No. 4    Neidle, C., D. MacLaughlin, and R.G. Lee, eds. (1997).

Syntactic Structure and Discourse Function: An Examination of Two Constructions in American Sign Language [pdf]

This volume brings together two papers that address the relation between syntactic structure and discourse function. Each paper analyzes a specific construction in ASL, proposing that the construction in question consists of two separate syntactic units that function together for a specific discourse purpose.

The paper by Hoza, Neidle, MacLaughlin, Kegl, and Bahan, A Unified Syntactic Account of Rhetorical Questions in American Sign Language, argues that rhetorical question-answer sequences are, in fact, syntactically composed of a question followed by an answer (in keeping with the way this construction had traditionally been described by ASL researchers, but in opposition to recent proposals that a wh-rhetorical question and the answer to it are contained syntactically within a single clause). The authors provide a unified analysis for both yes-no and wh-rhetorical question-answer sequences.

The paper by Lee, Neidle, MacLaughlin, Bahan, and Kegl, Role Shift in ASL: A Syntactic Look at Direct Speech, examines another construction used for a particular discourse function that has been analyzed in terms of syntactic embedding. The authors argue that concatenation of separate clauses is involved in the direct speech construction containing a verb of saying.

Both papers employ a series of tests to determine syntactic clause boundaries. By testing the predictions of the various competing proposals, the authors demonstrate that in neither case is there the kind of complex embedding that had been claimed. It is interesting that ASL seems to rely somewhat less on embedding than many languages. While there are clearly embedded structures in the language, neither rhetorical question-answer sequences nor direct speech constructions employ such a structure.

Report No. 2    Neidle, C., D. MacLaughlin, J. Kegl, and B. Bahan (1996).

Non-Manual Correlates of Syntactic Agreement in American Sign Language.
[report 2 download page (with links to pdf and QuickTime files)]

This article is intended to illustrate how language research that is based on videotaped data can be distributed in conjunction with the actual data. The article is in PDF (portable document format). It can be read using the Adobe Acrobat™ Reader, version 2.1 or greater, which is available free of charge from Adobe (http://www.adobe.com/acrobat). The Movie plug-in for Acrobat Reader is also required (and available from Adobe at no charge). The files can be read on any platform with QuickTime™ support. For Macintosh computers, QuickTime version 2.0 or greater is required. For Windows, Apple QuickTime 2.0 or later or Microsoft Video for Windows is required. The QuickTime software may be obtained from http://quicktime.apple.com.

Building upon previous findings as to the syntactic distribution of grammatical markers associated with negation, wh-questions, and yes/no questions, we extend our analysis to agreement phenomena within DP and IP, and to the non-manual expression of person features (which, in ASL, are associated with positions in the signing space). We present videotaped examples showing the non-manual correlates of subject and object agreement within IP. We argue that, in transitive constructions, subject agreement is generally manifested by head tilt toward the position in space associated with the subject, while object agreement is expressed by eye gaze toward the position associated with the object. Both of these markings exhibit the predicted distribution, spreading over the following VP. Our analysis of these markings as reflexes of syntactic agreement receives support from the fact that they can license null arguments.

We then show that head tilt and eye gaze also express agreement relations internal to DP. For example, in DP's that include a possessor, the head may tilt toward the location associated with the possessor, while the eyes may gaze in the direction associated with the main NP. We show that there are parallels between transitive clauses and DP's with possessors, and also between intransitive clauses and DP's without possessors, with respect to the realization of agreement (similar to what has been found for other languages). The data from ASL support the idea that the AGR heads of extended nominal and verbal projections exhibit similar syntactic properties.


Computer Science Technical Reports

Castelli, T. J., M. Betke, and C. Neidle (July 7, 2005).
Facial Feature Tracking and Occlusion Recovery in American Sign Language
Boston University Computer Science Technical Report No. 2005-024

[Abstract]   [pdf]