{
"paper_id": "O18-1026",
"header": {
"generated_with": "S2ORC 1.0.0",
"date_generated": "2023-01-19T08:09:53.602919Z"
},
"title": "On Four Metaheuristic Applications to Speech Enhancement -Implementing Optimization Algorithms with MATLAB R2018a",
"authors": [
{
"first": "Su-Mei",
"middle": [],
"last": "Shiue",
"suffix": "",
"affiliation": {
"laboratory": "",
"institution": "National Taipei University of Technology",
"location": {}
},
"email": ""
},
{
"first": "Lang-Jyi",
"middle": [],
"last": "Huang",
"suffix": "",
"affiliation": {
"laboratory": "",
"institution": "Taipei University of Technology",
"location": {}
},
"email": ""
},
{
"first": "Wei-Ho",
"middle": [],
"last": "Tsai",
"suffix": "",
"affiliation": {
"laboratory": "",
"institution": "Taipei University of Technology",
"location": {}
},
"email": "whtsai@ntut.edu.tw"
},
{
"first": "Yen-Lin",
"middle": [],
"last": "Chen",
"suffix": "",
"affiliation": {
"laboratory": "",
"institution": "Taipei University of Technology",
"location": {}
},
"email": "ylchen@csie.ntut.edu.tw"
}
],
"year": "",
"venue": null,
"identifiers": {},
"abstract": "This study aims to firstly implement four well-known metaheuristic optimization algorithms which, among other things, are utilized on adaptive filter design, for dual-channel speech enhancement, with voice communication devices. The implementations are conducted in a simulation fashion using MATLAB code under its newly release version of R2018a. Lately, the study takes a closer look at these four optimization methods, based on learning from the literature of the research society, on their pros and cons while applying to speech enhancement. These four methods are, namely, (1) Accelerated Particle Swarm Optimization (APSO), (2) Gravitational Search Algorithm (GSA), (3) a hybrid algorithm of PSO (Particle Swarm Optimization) and GSA (called hybrid PSOGSA), and (4) Bat Algorithm (BA). This 266 study performs the said implementations successfully and obtains useful experimental results which contributes to building a solid foundation for more advanced research in the near future. Besides, the implementations made by the study confirm the correctness of many a previous research works which claim that these four algorithms show better performance on improved speech quality and intelligibility as compared to that of the existing standard PSO (SPSO) based speech enhancement approach.",
"pdf_parse": {
"paper_id": "O18-1026",
"_pdf_hash": "",
"abstract": [
{
"text": "This study aims to firstly implement four well-known metaheuristic optimization algorithms which, among other things, are utilized on adaptive filter design, for dual-channel speech enhancement, with voice communication devices. The implementations are conducted in a simulation fashion using MATLAB code under its newly release version of R2018a. Lately, the study takes a closer look at these four optimization methods, based on learning from the literature of the research society, on their pros and cons while applying to speech enhancement. These four methods are, namely, (1) Accelerated Particle Swarm Optimization (APSO), (2) Gravitational Search Algorithm (GSA), (3) a hybrid algorithm of PSO (Particle Swarm Optimization) and GSA (called hybrid PSOGSA), and (4) Bat Algorithm (BA). This 266 study performs the said implementations successfully and obtains useful experimental results which contributes to building a solid foundation for more advanced research in the near future. Besides, the implementations made by the study confirm the correctness of many a previous research works which claim that these four algorithms show better performance on improved speech quality and intelligibility as compared to that of the existing standard PSO (SPSO) based speech enhancement approach.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Abstract",
"sec_num": null
}
],
"body_text": [
{
"text": "The concise characteristics and advantages of metaheuristic optimization methods as discussed in this paper for speech enhancement are listed in Table 1 below. Therefore, PSO can be used in optimization problems that are partly irregular, variable over time, etc. [5] .",
"cite_spans": [
{
"start": 264,
"end": 267,
"text": "[5]",
"ref_id": "BIBREF4"
}
],
"ref_spans": [
{
"start": 145,
"end": 152,
"text": "Table 1",
"ref_id": "TABREF0"
}
],
"eq_spans": [],
"section": "Introduction",
"sec_num": "1."
},
{
"text": "Simple conception, quick velocit significant SNR ratio for higher acc person will percept/ recognize the w GSA An algorithm inspired by Newton\"s laws of motion and gravitation force [1] .",
"cite_spans": [
{
"start": 182,
"end": 185,
"text": "[1]",
"ref_id": "BIBREF0"
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Introduction",
"sec_num": "1."
},
{
"text": "Updating is done by considering the performance than PSO [3] .",
"cite_spans": [
{
"start": 57,
"end": 60,
"text": "[3]",
"ref_id": "BIBREF2"
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Introduction",
"sec_num": "1."
},
{
"text": "A hybrid algorithm of PSO and GSA [3] . One of the most widely used swarm-intelligence-based algorithms owing to its simplicity and flexibility, particle swarm optimization, or PSO, was developed by Kennedy and Eberhart in 1995 [8] . The PSO algorithm searches the space of an objective function by adjusting the trajectories of individual agents, called particles, as the piecewise paths formed by positional vectors in a quasi-stochastic manner. The movement of a swarming particle is composed of two main components: a stochastic component and a deterministic component. Each particle is attracted toward the position of the current global best g * and its own best location in history, while at the same time it tends to move randomly.",
"cite_spans": [
{
"start": 34,
"end": 37,
"text": "[3]",
"ref_id": "BIBREF2"
},
{
"start": 228,
"end": 231,
"text": "[8]",
"ref_id": "BIBREF7"
}
],
"ref_spans": [],
"eq_spans": [],
"section": "PSOGSA",
"sec_num": null
},
{
"text": "When a particle finds a location that is better than any previously found locations, it updates that location as the new current best for particle i. There is a current best for all n particles at any time t during iterations. The aim is to find the global best among all the current best solutions until the objective no longer improves or after a certain number of iterations. The movement of particles is schematically represented in Figure 1 , where (t)",
"cite_spans": [],
"ref_spans": [
{
"start": 437,
"end": 445,
"text": "Figure 1",
"ref_id": null
}
],
"eq_spans": [],
"section": "PSOGSA",
"sec_num": null
},
{
"text": "is the current best for particle i , and g * \u2248 min{ f (xi )} for (i = 1, 2, . . . , n) is the current global best at t. moving toward the global best g * and the current best for each particle i.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "PSOGSA",
"sec_num": null
},
{
"text": "The essential steps of the particle swarm optimization are summarized as the pseudo code shown in Figure 2 . 2.2 Implement the Simulated PSO using MATLAB",
"cite_spans": [],
"ref_spans": [
{
"start": 98,
"end": 106,
"text": "Figure 2",
"ref_id": null
}
],
"eq_spans": [],
"section": "PSOGSA",
"sec_num": null
},
{
"text": "The simulated PSO is implemented by MATLAB code [9] and displays the following output. 2.3 Accelerated PSO (APSO)",
"cite_spans": [
{
"start": 48,
"end": 51,
"text": "[9]",
"ref_id": null
}
],
"ref_spans": [],
"eq_spans": [],
"section": "PSOGSA",
"sec_num": null
},
{
"text": "The standard particle swarm optimization (PSO) uses both the current global best g * and the individual best (t). However, there is no compelling reason for using the individual best unless the optimization problem of interest is highly nonlinear and multimodal.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "PSOGSA",
"sec_num": null
},
{
"text": "A simplified version that could accelerate the convergence of the algorithm is to use only the global best. This is the so-called accelerated particle swarm optimization (APSO) which was developed by Xin-She Yang in 2008 [7] . A further improvement to the accelerated PSO is to reduce the randomness as iterations proceed. That is to use a monotonically decreasing function such as (1) or 2where \u03b10 = 0.5 to 1 is the initial value of the randomness parameter. Here t is the number of iterations or time steps. is a control parameter.",
"cite_spans": [
{
"start": 221,
"end": 224,
"text": "[7]",
"ref_id": "BIBREF6"
},
{
"start": 382,
"end": 385,
"text": "(1)",
"ref_id": "BIBREF0"
}
],
"ref_spans": [],
"eq_spans": [],
"section": "PSOGSA",
"sec_num": null
},
{
"text": "The implementation of the APSO was simulated using MATLAB code [10] . The total number of particles = 10 and that of iterations = 10. The 2D \"Michalewicz\" objective function is used. Two outputs are shown below. 3. Gravitational Search Algorithm (GSA) for Speech Enhancement",
"cite_spans": [
{
"start": 63,
"end": 67,
"text": "[10]",
"ref_id": null
}
],
"ref_spans": [],
"eq_spans": [],
"section": "APSO implementation",
"sec_num": "2.4"
},
{
"text": "Gravitational search algorithm (GSA) is an optimization algorithm based on the law of gravity and mass interactions. This algorithm is based on the Newtonian gravity: \"Every particle in the universe attracts every other particle with a force that is directly proportional to the product of their masses and inversely proportional to the square of the distance between them\" [11] .",
"cite_spans": [
{
"start": 374,
"end": 378,
"text": "[11]",
"ref_id": null
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Gravitational Search Algorithm (GSA)",
"sec_num": "3.1"
},
{
"text": "The implementation of the GSA was simulated using MATLAB code [11] . This experiment uses Function 1 to calculate the objective function. The output is shown below. The simulation results concluded that the performance of GSA algorithm is better when compared to SPSO with respect to the speech quality and intelligibility. ",
"cite_spans": [
{
"start": 62,
"end": 66,
"text": "[11]",
"ref_id": null
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Gravitational Search Algorithm (GSA) Implementation",
"sec_num": "3.2"
},
{
"text": "The implementation of the GSA was simulated using MATLAB code [13] . This experiment uses Function 1 to calculate the objective function. The output is shown below. ",
"cite_spans": [
{
"start": 62,
"end": 66,
"text": "[13]",
"ref_id": null
}
],
"ref_spans": [],
"eq_spans": [],
"section": "PSOGSA Implementation",
"sec_num": "4.2"
},
{
"text": "The original version of this algorithm is suitable for continuous problems, so it cannot be applied to binary problems directly. Here [14, 15] , a binary version of this algorithm is available. Bat algorithm (BA) is one of the recently proposed heuristic algorithms imitating the echolocation behavior of bats to perform global optimization. The superior performance of this algorithm has been proven among the other most well-known algorithms such as genetic algorithm (GA) and particle swarm optimization (PSO). This version of this algorithm can be applied to binary problems directly. This study run the BA simulation successfully which was executed under MATLAB R2018a and the simulation output is displayed as follows:",
"cite_spans": [
{
"start": 134,
"end": 138,
"text": "[14,",
"ref_id": null
},
{
"start": 139,
"end": 142,
"text": "15]",
"ref_id": "BIBREF14"
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Bat Algorithm (BA) Implementation",
"sec_num": "5.2"
},
{
"text": "Since the implementations of the four metaheuristic optimization algorithms (APSO, GSA, PSOGSA, and BA) are well done, which provides a good base for more advanced study in the near future, such as to create a much powerful algorithm by making a variant of a current algorithm or combining multiple algorithms' advantages.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Conclusions",
"sec_num": "6."
},
{
"text": "Besides, with the work presented in [3] , it is proved that the performances of the proposed algorithms are compared with the performance of the existing standard PSO based speech enhancement approach. From the results it is observed that each of the proposed algorithms achieved better performance when compared with that of standard PSO based speech enhancement approach with improved speech quality and intelligibility scores.",
"cite_spans": [
{
"start": 36,
"end": 39,
"text": "[3]",
"ref_id": "BIBREF2"
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Conclusions",
"sec_num": "6."
}
],
"back_matter": [],
"bib_entries": {
"BIBREF0": {
"ref_id": "b0",
"title": "Gravitational Search Algorithms in Data Mining: A Survey, IJARCCE ISSN (Online) 2278-1021 ISSN (Print) 2319 5940",
"authors": [
{
"first": "S",
"middle": [],
"last": "Poonam",
"suffix": ""
},
{
"first": "",
"middle": [],
"last": "Ratnoo",
"suffix": ""
}
],
"year": 2007,
"venue": "International Journal of Advanced Research in Computer and Communication Engineering ISO",
"volume": "3297",
"issue": "6",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Poonam, S. Ratnoo, Gravitational Search Algorithms in Data Mining: A Survey, IJARCCE ISSN (Online) 2278-1021 ISSN (Print) 2319 5940 International Journal of Advanced Research in Computer and Communication Engineering ISO 3297:2007 Certified Vol. 6, Issue 6, June 2017.",
"links": null
},
"BIBREF1": {
"ref_id": "b1",
"title": "A New Dual Channel Speech Enhancement Approach Based on Accelerated Particle Swarm Optimization (APSO), I.J. Intelligent Systems and Applications",
"authors": [
{
"first": "K",
"middle": [],
"last": "Prajna",
"suffix": ""
},
{
"first": "G",
"middle": [
"S B"
],
"last": "Rao",
"suffix": ""
},
{
"first": "K",
"middle": [
"V V S"
],
"last": "Reddy",
"suffix": ""
},
{
"first": "R",
"middle": [
"U"
],
"last": "Maheswari",
"suffix": ""
}
],
"year": 2014,
"venue": "MECS",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {
"DOI": [
"10.5815/ijisa.2014.04.01"
]
},
"num": null,
"urls": [],
"raw_text": "K. Prajna, G. S. B. Rao, K. V.V. S. Reddy, R. U. Maheswari, A New Dual Channel Speech Enhancement Approach Based on Accelerated Particle Swarm Optimization (APSO), I.J. Intelligent Systems and Applications, 2014, 04, 1-10 Published Online March 2014 in MECS (http://www.mecs-press.org/) DOI: 10.5815/ijisa.2014.04.01.",
"links": null
},
"BIBREF2": {
"ref_id": "b2",
"title": "Metaheuristic Applications to Speech Enhancement, Springer Briefs in Speech Technology",
"authors": [
{
"first": "P",
"middle": [],
"last": "Kunche",
"suffix": ""
},
{
"first": "K",
"middle": [
"V V S"
],
"last": "Reddy",
"suffix": ""
}
],
"year": 2016,
"venue": "",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "P. Kunche and K.V.V.S. Reddy, Metaheuristic Applications to Speech Enhancement, Springer Briefs in Speech Technology, Springer International Publishing AG Switzerland, 2016.",
"links": null
},
"BIBREF3": {
"ref_id": "b3",
"title": "Bat Algorithm: Literature Review and Applications",
"authors": [
{
"first": "X.-S",
"middle": [],
"last": "Yang",
"suffix": ""
}
],
"year": 2013,
"venue": "J. Bio-Inspired Computation",
"volume": "5",
"issue": "3",
"pages": "141--149",
"other_ids": {
"DOI": [
"10.1504/IJBIC.2013.055093"
]
},
"num": null,
"urls": [],
"raw_text": "X.-S. Yang, Bat Algorithm: Literature Review and Applications, J. Bio-Inspired Computation, Vol. 5, No. 3, pp. 141-149 (2013). DOI: 10.1504/IJBIC.2013.055093.",
"links": null
},
"BIBREF4": {
"ref_id": "b4",
"title": "Speech Enhancement Analysis using PSO and ANFIS Methods",
"authors": [
{
"first": "M",
"middle": [
"P"
],
"last": "Kumar",
"suffix": ""
},
{
"first": "R",
"middle": [
"P"
],
"last": "Das",
"suffix": ""
}
],
"year": 2016,
"venue": "International Journal of Scientific Engineering and Technology Research",
"volume": "05",
"issue": "",
"pages": "10059--10065",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "M. P. KUMAR, R. P. DAS, Speech Enhancement Analysis using PSO and ANFIS Methods, International Journal of Scientific Engineering and Technology Research (IJSETR), ISSN 2319-8885 Vol.05, Issue.49, December-2016, Pages:10059-10065.",
"links": null
},
"BIBREF5": {
"ref_id": "b5",
"title": "Binary Bat Algorithm, Neural Computing and Applications",
"authors": [
{
"first": "S",
"middle": [],
"last": "Mirjalili",
"suffix": ""
},
{
"first": "S",
"middle": [
"M"
],
"last": "Mirjalili",
"suffix": ""
},
{
"first": "X",
"middle": [],
"last": "Yang",
"suffix": ""
}
],
"year": 2014,
"venue": "",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {
"DOI": [
"10.1007/s00521-013-1525-5"
]
},
"num": null,
"urls": [],
"raw_text": "S. Mirjalili, S. M. Mirjalili, X. Yang, Binary Bat Algorithm, Neural Computing and Applications, In press, 2014, Springer DOI:http://dx.doi.org/10.1007/s00521-013-1525-5",
"links": null
},
"BIBREF6": {
"ref_id": "b6",
"title": "Nature-Inspired Optimization Algorithms",
"authors": [
{
"first": "X.-S",
"middle": [],
"last": "Yang",
"suffix": ""
}
],
"year": 2014,
"venue": "",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "X.-S. Yang, Nature-Inspired Optimization Algorithms, Elsevier Inc., 2014.",
"links": null
},
"BIBREF7": {
"ref_id": "b7",
"title": "Particle swarm optimization",
"authors": [
{
"first": "J",
"middle": [],
"last": "Kennedy",
"suffix": ""
},
{
"first": "",
"middle": [],
"last": "Eberhart",
"suffix": ""
}
],
"year": 1995,
"venue": "Proceedings of the IEEE international conference on neural networks",
"volume": "",
"issue": "",
"pages": "1942--1990",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "J. Kennedy, RC Eberhart. Particle swarm optimization. In: Proceedings of the IEEE international conference on neural networks, Piscataway, NJ, USA; 1995. p. 1942-48.",
"links": null
},
"BIBREF11": {
"ref_id": "b11",
"title": "Paper: A New Hybrid PSOGSA Algorithm for Function Optimization",
"authors": [],
"year": 2010,
"venue": "IEEE International Conference on Computer and Information Application",
"volume": "",
"issue": "",
"pages": "374--377",
"other_ids": {
"DOI": [
"10.1109/ICCIA.2010.6141614"
]
},
"num": null,
"urls": [],
"raw_text": "Paper: A New Hybrid PSOGSA Algorithm for Function Optimization, in IEEE International Conference on Computer and Information Application(ICCIA 2010), China, 2010, pp.374-377, DOI: http://dx.doi.org/10.1109/ICCIA.2010.6141614",
"links": null
},
"BIBREF14": {
"ref_id": "b14",
"title": "Binary Bat Algorithm, Neural Computing and Applications",
"authors": [
{
"first": "S",
"middle": [],
"last": "Mirjalili",
"suffix": ""
},
{
"first": "S",
"middle": [
"M"
],
"last": "Mirjalili",
"suffix": ""
},
{
"first": "X.-S",
"middle": [],
"last": "Yang",
"suffix": ""
}
],
"year": 2014,
"venue": "",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {
"DOI": [
"10.1007/s00521-013-1525-5"
]
},
"num": null,
"urls": [],
"raw_text": "S. Mirjalili, S. M. Mirjalili, X.-S. Yang, Binary Bat Algorithm, Neural Computing and Applications, In press, 2014, Springer DOI:http://dx.doi.org/10.1007/s00521-013-1525-5",
"links": null
}
},
"ref_entries": {
"FIGREF0": {
"text": "Schematic representation of the movement of a particle in PSO[7] Particle Swarm Optimization Pseudo code of Particle Swarm Optimization[7]",
"num": null,
"uris": null,
"type_str": "figure"
},
"FIGREF1": {
"text": "The PSO algorithm searches the space of the \"ackleysfcn\" objective function and reaches the convergence (generation = iteration)",
"num": null,
"uris": null,
"type_str": "figure"
},
"FIGREF2": {
"text": "Michalewicz objective function with the global optimality at (2.20319, 1.57049).",
"num": null,
"uris": null,
"type_str": "figure"
},
"FIGREF3": {
"text": "Initial and final locations of 20 particles after 10 iterations.2.5 APSO implementation results for speech enhancementThe proposed algorithm does not use individual best position for updating the new locations and is capable of finding the solution in minimum number of iterations. Further, the proposed APSO algorithm is converging faster compared to SPSO algorithm[3].",
"num": null,
"uris": null,
"type_str": "figure"
},
"FIGREF4": {
"text": "This implementation of GSA uses Function 1 to compute the value of objective function 3.3 GSA implementation results for speech enhancement[3]",
"num": null,
"uris": null,
"type_str": "figure"
},
"FIGREF5": {
"text": "This implementation of PSOGSA uses Function 23 to compute the value of objective function 4.3 PSOGSA Implementation Results for Speech Enhancement The SNR levels of the input noisy signal for babble-type noise is set at \u221210 ,\u22125, 0 and 5 dB. Results are averaged over 10 trail runs. The performance of the algorithms is evaluated by computing SNRI measure, and the results are tabulated. There is 10 dB improvement in PSOGSA when compared with that of SPSO algorithm. At all other noise input SNR levels of noisy speech (\u221210, 0 and 5 dB), there is almost 6 dB improvement in SNRI with PSOGSA when compared with that of SPSO. Throughout input SNR level, there is almost 2 dB improvement in PSOGSA when compared with that of conventional GSA. The performance of the algorithms is also evaluated by computing intelligibility objective measure FAI [3], 5. Bat Algorithm (BA) for Speech Enhancement 5.1 Bat Algorithm (BA)Bat algorithm (BA) is one of the recently proposed heuristic algorithms imitating the echolocation behavior of bats to perform global optimization. The superior performance of this algorithm has been proven among the other most well-known algorithms such as genetic algorithm (GA) and particle swarm optimization (PSO)[14,15].",
"num": null,
"uris": null,
"type_str": "figure"
},
"FIGREF6": {
"text": "Implementation of the Binary version BA 5.3 Bat Algorithm (BA) Implementation results for speech enhancement Other related simulation discussion on BA [3]: BA has yielded best solutions to the problem of adaptive filtering in speech enhancement by providing a dynamic balance between the exploration and exploitation of search space with an automatic zooming effect. With respect to quality and intelligibility of enhanced speech, simulation results proved that the proposed BA-based enhancement algorithm is superior to SPSO-based enhancement algorithm and all other algorithms proposed in this research work.",
"num": null,
"uris": null,
"type_str": "figure"
},
"TABREF0": {
"content": "<table/>",
"text": "Important metaheuristic optimization methods for speech enhancement Name Characteristics PSO PSO does not require that the problem of optimization to be differentiated as required by classical optimization methods such as gradient descent and quasi Newton methods.",
"html": null,
"num": null,
"type_str": "table"
},
"TABREF2": {
"content": "<table><tr><td>Table 3. Comparison of intelligibility objective measure FAI</td></tr><tr><td>4. Hybrid PSOGSA for Speech Enhancement</td></tr><tr><td>4.1 PSOGSA</td></tr></table>",
"text": "Comparison of quality objective measure PESQ And it can be concluded that PSOGSA can more effectively reduce the background noise of the noisy input speech[3]. The main idea of PSOGSA is to integrate the ability of exploitation in PSO with the ability of exploration in GSA to synthesize both algorithms' strength. Some benchmark test functions are used to compare the hybrid algorithm with both the standard PSO and GSA algorithms in evolving best solution[12].",
"html": null,
"num": null,
"type_str": "table"
}
}
}
}