06
3012405004818
Ciaco scrl
onixsuitesupport@onixsuite.com
20231206
eng
COM.ONIXSUITE.9782874630637
03
02
i6doc
01
SKU
75390
02
2874630632
03
9782874630637
15
9782874630637
10
BC
<TitleType>01</TitleType>
<TitleText>Thèses de l'Université catholique de Louvain (UCL)</TitleText>
Numéro 138
Thèses de l'École polytechnique de Louvain
138
<TitleType>01</TitleType>
<TitleText>Contrast properties of entropic criteria for blind source separation</TitleText>
<Subtitle>A unifying framework based on information-theoretic inequalities</Subtitle>
01
GCOI
28001100593060
1
A01
Frédéric Vrins
Vrins, Frédéric
Frédéric
Vrins
<p>Frédéric Vrins was born in Uccle (Belgium) in 1979. He received the MS degree in electromechanical engineering from the Université catholique de Louvain (Belgium) in 2002. From 2002 to 2007, he served as an Assistant at the same university and worked towards the PhD degree within the UCL Machine Learning Group. He led the Association du Corps Scientifique en Sciences Appliquées (ACSSA) during 2006-2007. Dr Vrins is the author or co-author of about 30 published scientific papers.</p>
1
01
eng
02
eng
306
00
306
03
TEC000000
29
2012
3069
TECHNIQUES ET SCIENCES APPLIQUEES
24
INTERNET
Sciences appliquées
01
06
01
<p>In recent years, Independent Component Analysis has become a fundamental tool in signal and data processing, especially in the field of Blind Source Separation (BSS): under mild conditions, independent source signals can be recovered from mixtures of them by maximizing a so-called contrast function. Neither the mixing system nor the original sources are needed for that purpose, justifying the term "blind". Among the existing BSS methods is the class of approaches maximizing Information-Theoretic Criteria (ITC), which rely on Rényi's entropies, including the well-known Shannon and Hartley entropies. These ITC are maximized via adaptive optimization schemes. Two major issues in this field are the following: i) are ITC really contrast functions? and ii) since most algorithms in fact look for a local maximum point, how relevant are these local optima from the BSS point of view? Even though the literature offers some partial answers to these questions, most of them are based on simulations and conjectures; formal developments are often lacking. This thesis aims at filling this gap while also providing intuitive justifications. The BSS problem is stated in Chapter 1 and viewed from the information-theoretic angle. The next two chapters address the above questions specifically: Chapter 2 discusses the contrast-function property of ITC, while Chapter 3 investigates the possible existence of spurious local maximum points of ITC. Finally, Chapter 4 deals with a range-based criterion, the only "entropy-based" contrast function which is discriminant, i.e. free of spurious local maxima. The interest of this approach is confirmed by testing the proposed technique on various examples, including the MLSP 2006 data analysis competition benchmark; our method outperforms previously reported results on large-scale and noisy mixture samples obtained through ill-conditioned mixing matrices.</p>
03
<p>In recent years, Independent Component Analysis has become a fundamental tool in signal and data processing, especially in the field of Blind Source Separation (BSS): under mild conditions, independent source signals can be recovered from mixtures of them by maximizing a so-called contrast function. Neither the mixing system nor the original sources are needed for that purpose, justifying the term "blind". Among the existing BSS methods is the class of approaches maximizing Information-Theoretic Criteria (ITC), which rely on Rényi's entropies, including the well-known Shannon and Hartley entropies. These ITC are maximized via adaptive optimization schemes. Two major issues in this field are the following: i) are ITC really contrast functions? and ii) since most algorithms in fact look for a local maximum point, how relevant are these local optima from the BSS point of view? Even though the literature offers some partial answers to these questions, most of them are based on simulations and conjectures; formal developments are often lacking. This thesis aims at filling this gap while also providing intuitive justifications. The BSS problem is stated in Chapter 1 and viewed from the information-theoretic angle. The next two chapters address the above questions specifically: Chapter 2 discusses the contrast-function property of ITC, while Chapter 3 investigates the possible existence of spurious local maximum points of ITC. Finally, Chapter 4 deals with a range-based criterion, the only "entropy-based" contrast function which is discriminant, i.e. free of spurious local maxima. The interest of this approach is confirmed by testing the proposed technique on various examples, including the MLSP 2006 data analysis competition benchmark; our method outperforms previously reported results on large-scale and noisy mixture samples obtained through ill-conditioned mixing matrices.</p>
02
In recent years, Independent Component Analysis has become a fundamental tool in signal and data processing, especially in the field of Blind Source Separation (BSS); under mild conditions, independent source signals can be recovered from...
04
<p>Abstract xiii</p>
<p>Acknowledgments xv</p>
<p>Acronyms xvii</p>
<p>List of Notation xix</p>
<p>Introduction xxiii</p>
<p>1 BSS and its relationship to ICA 1</p>
<p>1.1 BSS: Motivation 2</p>
<p>1.2 ICA: an efficient tool for BSS 7</p>
<p>1.2.1 PD-equivalency and Non-mixing matrices 7</p>
<p>1.2.2 Independence and ICA 11</p>
<p>1.2.3 ICA and BSS 12</p>
<p>1.3 Independence measures 14</p>
<p>1.3.1 Divergence measures between densities 14</p>
<p>1.3.1.1 KL properties 15</p>
<p>1.3.1.2 From KL to mutual information 16</p>
<p>1.3.2 Other measures of independence 17</p>
<p>1.4 Extraction schemes and contrast function definition 18</p>
<p>1.4.1 Extraction schemes 19</p>
<p>1.4.1.1 Simultaneous separation 19</p>
<p>1.4.1.2 Deflation separation 19</p>
<p>1.4.1.3 Partial separation 19</p>
<p>1.4.2 Contrast functions 20</p>
<p>1.4.2.1 Simultaneous separation 20</p>
<p>1.4.2.2 Deflation separation 20</p>
<p>1.4.2.3 Partial separation 21</p>
<p>1.5 Whitening preprocessing and geodesic search 22</p>
<p>1.5.1 Whitening 22</p>
<p>1.5.2 Orthogonal contrast functions 24</p>
<p>1.5.3 Angular parametrization in the K=2 case 25</p>
<p>1.5.4 Manifold-constrained problem and geodesic optimization 25</p>
<p>1.6 Adaptive maximization of contrast functions 27</p>
<p>1.7 BSS and information measures 29</p>
<p>1.7.1 Information measure 30</p>
<p>1.7.1.1 Coding using Hartley's formula 30</p>
<p>1.7.1.2 Information and entropy 32</p>
<p>1.7.1.3 Extension to continuous random variables 33</p>
<p>1.7.1.4 Information gain and Mutual information 34</p>
<p>1.7.2 Entropy as a "complexity measure" 37</p>
<p>1.7.3 Generalized information measures 40</p>
<p>1.8 Issues and objectives of the Thesis 42</p>
<p>2 Contrast property of Entropic criteria 45</p>
<p>2.1 Some tools for building contrast functions 47</p>
<p>2.1.1 From orthogonal deflation to orthogonal partial separation 47</p>
<p>2.1.2 Huber's superadditivity 49</p>
<p>2.2 Shannon's entropy-based contrast 51</p>
<p>2.2.1 Simultaneous approach 51</p>
<p>2.2.2 Deflation approach 53</p>
<p>2.2.2.1 The contrast property 53</p>
<p>2.2.2.2 Non-mixing local maxima 57</p>
<p>2.2.3 Partial approach 58</p>
<p>2.3 Minimum range contrast 59</p>
<p>2.3.1 Support and Brunn-Minkowski Inequality 59</p>
<p>2.3.2 Properties of the range 60</p>
<p>2.3.3 Simultaneous approach 61</p>
<p>2.3.4 Deflation approach 62</p>
<p>2.3.5 Partial approach 64</p>
<p>2.3.6 Support versus Range 65</p>
<p>2.3.7 A tool for building a D-BSS contrast based on Huber 65</p>
<p>2.4 Rényi's entropy contrast 66</p>
<p>2.4.1 Taylor development of Rényi's entropy 67</p>
<p>2.4.2 Deflation approach 70</p>
<p>2.4.3 Simultaneous approach 71</p>
<p>2.4.4 Partial approach 72</p>
<p>2.4.5 Some counter-examples 72</p>
<p>2.5 Conclusion of the chapter 78</p>
<p>2.5.1 Summary of results 78</p>
<p>2.5.2 Use of Rényi entropies in blind separation/deconvolution 79</p>
<p>2.6 Appendix: Proofs of results of the Chapter 84</p>
<p>2.6.1 Proof of Corollary 6 (wording p. 50) 84</p>
<p>2.6.2 Proof of Property 4 (wording p. 57) 84</p>
<p>2.6.3 Proof of Theorem 11 (wording p. 58) 86</p>
<p>2.6.4 Proof of Lemma 6 (wording p. 60) 87</p>
<p>2.6.5 Proof of Theorem 14 (wording p. 62) 89</p>
<p>2.6.6 Proof of Theorem 15 (wording p. 63) 90</p>
<p>2.6.7 Proof of Theorem 16 (wording p. 64) 91</p>
<p>2.6.8 Convolution of Gaussian kernels (wording p. 67) 92</p>
<p>2.6.9 Proof of Lemma 7 (wording p. 71) 93</p>
<p>2.6.10 Proof of Lemma 8 (wording p. 71) 94</p>
<p>2.6.11 Proof of Lemma 9 (wording p. 72) 95</p>
<p>2.6.12 Proof of Lemma 10 (wording p. 72) 95</p>
<p>3 Discriminacy property of Entropic contrasts 97</p>
<p>3.1 Concept definition and justification 99</p>
<p>3.2 Discriminacy of Shannon's entropy 100</p>
<p>3.2.1 Informal approach: the multimodal case 102</p>
<p>3.2.1.1 Location of the entropy minima 102</p>
<p>3.2.1.2 Modality and entropy minima 107</p>
<p>3.2.2 Formal analysis using a Taylor expansion 112</p>
<p>3.2.2.1 Simultaneous (mutual information) 112</p>
<p>3.2.2.2 Deflation (negentropy) 119</p>
<p>3.2.3 Formal analysis using entropy approximation 120</p>
<p>3.2.3.1 Entropy bounds on multimodal pdf 121</p>
<p>3.2.3.2 Mixing local minima in multimodal BSS 125</p>
<p>3.2.3.3 Local minimum points of H(wU) 126</p>
<p>3.2.3.4 Local minimum points of h(wS) 129</p>
<p>3.2.3.5 Complementary observations 133</p>
<p>3.2.4 Cumulant-based versus Information-theoretic approaches 133</p>
<p>3.3 Discriminacy of Rényi's entropy 138</p>
<p>3.4 Discriminacy of the minimum range approach 139</p>
<p>3.4.1 Preliminaries: K = 2 case 139</p>
<p>3.4.2 Deflation approach 140</p>
<p>3.4.2.1 Small variation analysis on manifolds 140</p>
<p>3.4.2.2 Piecewise g-convexity on hyper-sphere 143</p>
<p>3.4.2.3 Second order approach 149</p>
<p>3.4.3 Simultaneous approach 150</p>
<p>3.4.4 Partial approach 152</p>
<p>3.4.5 Givens trajectories and discriminacy 155</p>
<p>3.5 Discriminacy of the minimum support approach 160</p>
<p>3.6 Summary of results and contrast sets configuration 163</p>
<p>3.7 Conclusion of the chapter 164</p>
<p>3.7.1 Summary of results 164</p>
<p>3.7.2 Comparison with existing results 166</p>
<p>3.8 Appendix: Proofs of results of the Chapter 167</p>
<p>3.8.1 Proof of relation (3.14) (wording p. 114) 167</p>
<p>3.8.2 Proof of Lemma 11 (wording p. 116) 169</p>
<p>3.8.3 Proof of Lemma 12 (wording p. 116) 170</p>
<p>3.8.4 Proof of Lemma 13 (wording p. 121) 171</p>
<p>3.8.5 Proof of Lemma 14 (wording p. 126) 173</p>
<p>3.8.6 Proof of Lemma 15 (wording p. 130) 174</p>
<p>3.8.7 Proof of Lemma 16 (wording p. 140) 177</p>
<p>3.8.8 Proof of Theorem 18 (wording p. 141) 178</p>
<p>3.8.9 Proof of Lemma 17 (wording p. 149) 179</p>
<p>3.8.10 Proof of Lemma 19 (wording p. 151) 180</p>
<p>3.8.11 Proof of Corollary 15 (wording p. 152) 181</p>
<p>3.8.12 Proof of Lemma 20 (wording p. 152) 181</p>
<p>3.8.13 Proof of Lemma 21 (wording p. 152) 181</p>
<p>3.8.14 Proof of Lemma 23 (wording p. 154) 185</p>
<p>3.8.15 Proof of Corollary 18 (wording p. 155) 186</p>
<p>4 Minimum output range methods 187</p>
<p>4.1 Minimum range approach 188</p>
<p>4.1.1 Interpretation of the simultaneous approach 189</p>
<p>4.1.1.1 Interpretation in the mixture space 189</p>
<p>4.1.1.2 Interpretation in the output space 190</p>
<p>4.1.2 Interpretation of the deflation approach 191</p>
<p>4.2 Range estimation 198</p>
<p>4.2.1 Some existing methods for endpoint estimation 198</p>
<p>4.2.2 Existing Range estimation 200</p>
<p>4.2.2.1 Support estimation via density estimation 202</p>
<p>4.2.2.2 Range estimation for BSS application 203</p>
<p>4.2.3 Quasi-range based approach 203</p>
<p>4.2.3.1 The observed range estimator 204</p>
<p>4.2.3.2 The m-averaged quasi-range estimator 208</p>
<p>4.2.3.3 Robustness of minimum-support ICA algorithms 213</p>
<p>4.3 Range minimization algorithm: SWICA 220</p>
<p>4.3.1 Algorithm 222</p>
<p>4.3.2 Performance analysis of SWICA for OS-based range estimators 224</p>
<p>4.4 Extensions of the minimum range 226</p>
<p>4.4.1 The problem of blind image separation: NOSWICA 227</p>
<p>4.4.1.1 SWICA for correlated image separation 227</p>
<p>4.4.1.2 NOSWICA: a non-orthogonal extension 232</p>
<p>4.4.2 Application to lower- or upper-bounded sources with possibly infinite range 238</p>
<p>4.4.2.1 LABICA 238</p>
<p>4.4.2.2 Practical estimation 241</p>
<p>4.4.2.3 Optimization scheme for "hard" ICA problems 242</p>
<p>4.4.2.4 LABICA applied to MLSP'06 benchmark 246</p>
<p>4.5 Conclusion of the Chapter 250</p>
<p>4.6 Appendix 250</p>
<p>4.6.1 Proof of relation (4.52) 250</p>
<p>4.6.2 Expectation of the order statistics cdf differences 251</p>
<p>4.6.3 Variance of the order statistics cdf differences 253</p>
<p>Conclusion 255</p>
<p>Appendix A: Announcement of the IEEE MLSP 2006 data analysis competition 275</p>
<p>Appendix B: Author's publication list 279</p>
04
01
http://www.i6doc.com/resources/titles/28001100593060/images/3bcf6eecb2611212e088d0d91f2ade9c/THUMBNAIL/9782874630637.jpg
17
03
01
https://i6doc.com/resources/publishers/35.jpg
18
09
01
https://i6doc.com/resources/publishers/73.png
38
https://i6doc.com/en/book/?GCOI=28001100593060
06
3052405007518
Presses universitaires de Louvain
01
06
3052405007518
Presses universitaires de Louvain
01
Presses universitaires de Louvain
http://pul.uclouvain.be/
04
20070101
2007
02
WORLD
01
9.45
in
02
6.30
in
03
1.63
in
08
17.28
oz
01
24
cm
02
16
cm
03
1.63
cm
08
490
gr
06
3012405004818
CIACO - DUC
33
www.i6doc.com
http://www.i6doc.com
03
WORLD 01 2
20
1
02
00
02
02
STD
02
22.80
EUR
R
6.00
21.51
1.29