Recovery of Articulatory Movements from Acoustics with Phonemic Information

Takesi Okadome, Shin Suzuki, and Masaaki Honda

Abstract:

The method presented here recovers articulatory parameters from acoustics in continuous speech by searching an acoustic-articulatory codebook built from data obtained in experiments with a magnetic sensor system. The codebook consists of pairs of segmented acoustic and articulatory parameters, represented respectively by 30 LPC cepstral coefficients and the Cartesian coordinates of nine points on the articulators; the time interval of a segment is 160 msec. Furthermore, the method can estimate articulatory parameters more precisely by using the phonemic information in an utterance. It determines articulatory parameters by selecting the articulatory code vector in the codebook that minimizes a weighted sum of the segmental spectral distance, the squared distance between successive articulatory parameters, and the squared distance between the position designated by the code vector and the position of the articulator predicted by the minimum-acceleration model for the phonemic sequence. Experiments were conducted to evaluate the effectiveness of both constraints in determining articulatory parameters by comparing the estimated and the observed articulatory parameters. The results show that the rms error between the estimated and observed articulatory parameters was about 1.65 mm on average, and that the articulatory features for vowels and consonants were recovered well.
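The following is a minimal sketch of the codebook search described in the abstract, under several assumptions: a precomputed codebook of (cepstrum, articulator-position) pairs, a simple greedy segment-by-segment selection, Euclidean cepstral distance as the spectral distance, and illustrative weight values. All function and parameter names are hypothetical; this is not the authors' implementation.

```python
import numpy as np

def estimate_articulation(cepstra, codebook_cep, codebook_art, model_pred,
                          w_spec=1.0, w_smooth=0.1, w_model=0.1):
    """For each 160-ms segment, select the articulatory code vector that
    minimizes a weighted sum of (i) the spectral distance to the observed
    cepstrum, (ii) the squared distance to the previously selected
    articulatory parameters, and (iii) the squared distance to the position
    predicted by the minimum-acceleration model for the phonemic sequence.

    cepstra      : (T, 30)  observed LPC cepstral coefficients per segment
    codebook_cep : (K, 30)  acoustic part of the codebook
    codebook_art : (K, 18)  articulatory part (x, y of nine points)
    model_pred   : (T, 18)  minimum-acceleration-model prediction per segment
    """
    T = cepstra.shape[0]
    est = np.zeros((T, codebook_art.shape[1]))
    prev = None
    for t in range(T):
        # Spectral distance: squared Euclidean distance between cepstra
        # (an assumed stand-in for the segmental spectral distance).
        spec = np.sum((codebook_cep - cepstra[t]) ** 2, axis=1)
        # Continuity term: squared distance to the previous segment's estimate.
        smooth = 0.0 if prev is None else np.sum((codebook_art - prev) ** 2, axis=1)
        # Phonemic constraint: squared distance to the model prediction.
        model = np.sum((codebook_art - model_pred[t]) ** 2, axis=1)
        cost = w_spec * spec + w_smooth * smooth + w_model * model
        k = int(np.argmin(cost))
        est[t] = codebook_art[k]
        prev = est[t]
    return est
```

Given observed articulator trajectories `observed` with the same shape as `est`, the evaluation metric quoted in the abstract would correspond to an rms error such as `np.sqrt(np.mean((est - observed) ** 2))`, reported as about 1.65 mm on average. A dynamic-programming search over whole utterances could replace the greedy loop shown here; the abstract does not specify which search is used.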
