Design and Research of a Speech Enhancement System Based on Spectral Subtraction


Abstract

Speech processing systems are inevitably disturbed by various kinds of noise. Noise not only degrades speech quality and intelligibility, but can also cause a sharp deterioration in system performance, in severe cases preventing the system from working at all. To suppress such interference, speech processing systems widely employ speech enhancement techniques to improve speech quality and intelligibility and thereby raise overall system performance, which makes the study of speech enhancement technology significant.

A complete, well-designed speech enhancement system involves several necessary stages. First the noise must be estimated; the resulting estimate is then fed into a suitable enhancement algorithm, which is the core of the whole system; finally, the output is post-processed according to the requirements of the particular application.

Speech enhancement generally exists as a pre-processing or front-end module within a speech processing system. Since noise comes in many types, the enhancement methods aimed at each type also differ. This thesis starts from the characteristics of speech, the characteristics of noise, the perceptual properties of the human ear, and methods of speech signal analysis; it surveys and compares several commonly used speech enhancement algorithms, then focuses on the enhancement algorithm based on basic spectral subtraction and improves it to address its shortcomings. Finally, simulation experiments in the Matlab environment compare the traditional spectral subtraction method with the improved algorithm proposed here under practical conditions, verifying the effectiveness and feasibility of the improved spectral subtraction algorithm. The thesis concludes with a summary of the work presented.

Keywords: speech enhancement; short-time magnitude spectrum estimation; spectral subtraction; musical noise
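Since the abstract only summarizes the method, a minimal MATLAB sketch of basic magnitude spectral subtraction may help: each frame is windowed and transformed with the FFT, an estimated noise magnitude spectrum is subtracted, and the frame is resynthesized with the noisy phase. The file names, frame length, overlap, the use of the leading frames as a noise-only estimate, and the spectral-floor constant are all illustrative assumptions; the spectral floor shown is one common remedy for musical noise, not necessarily the specific improvement studied in this thesis.

```matlab
% Minimal sketch of basic magnitude spectral subtraction.
% All parameters and file names are illustrative assumptions.
[x, fs] = audioread('noisy.wav');     % hypothetical noisy input
x = x(:, 1);                          % use one channel
frameLen = 256;                       % assumed frame length
hop = frameLen / 2;                   % 50% overlap
win = 0.54 - 0.46 * cos(2*pi*(0:frameLen-1)'/(frameLen-1));  % Hamming window
nFrames = floor((length(x) - frameLen) / hop) + 1;

% Noise magnitude estimate: average over leading frames assumed noise-only.
nNoise = min(5, nFrames);
noiseMag = zeros(frameLen, 1);
for k = 1:nNoise
    noiseMag = noiseMag + abs(fft(x((k-1)*hop + (1:frameLen)) .* win));
end
noiseMag = noiseMag / nNoise;

y = zeros(size(x));
for k = 1:nFrames
    idx = (k-1)*hop + (1:frameLen);
    X = fft(x(idx) .* win);
    mag = abs(X) - noiseMag;          % subtract the noise magnitude spectrum
    mag = max(mag, 0.02 * noiseMag);  % spectral floor to limit musical noise
    Y = mag .* exp(1i * angle(X));    % keep the noisy phase
    y(idx) = y(idx) + real(ifft(Y));  % overlap-add resynthesis
end
y = y / max(abs(y) + eps);            % normalize to avoid clipping
audiowrite('enhanced.wav', y, fs);
```

The floor of 0.02 times the noise estimate keeps isolated residual spectral peaks from standing alone against silence, which is what listeners perceive as musical noise; the traditional method corresponds to setting this floor to zero (half-wave rectification of the subtracted spectrum).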
