VEU - Grup de Tractament de la Parla (Speech Processing Group)
http://hdl.handle.net/2117/3746
Tue, 03 May 2016 05:21:10 GMT
http://hdl.handle.net/2117/86453
Looking for efficient and accurate ways of computing the global ionospheric electron density distribution from huge amounts of GNSS observations
Hernández Pajares, Manuel; Juan Zornoza, José Miguel; Sanz Subirana, Jaume; Monte Moreno, Enrique; Aragón Ángel, María Ángeles
In this work the authors explore different ways of estimating efficiently and accurately the global number density of ionospheric free electrons from most of the GNSS measurements available today, taken from ground-based GPS receivers (the IGS network) and GPS receivers on board LEO satellites (such as the FORMOSAT-3/COSMIC constellation). The approach is designed as a bootstrapping scheme: it starts from a first determination of global VTEC maps based on the ground data, passes through an optimal error-decorrelation treatment in the VTEC interpolation, applied to improve the inversion of the GPS occultation measurements, and ends with an electron density extrapolation process aided by simple first-principle conditions. The performance against external reference data, including dual-frequency altimeter and ionosonde measurements, is also shown to support the conclusions under different Solar Cycle conditions.
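The first stage of such a pipeline relies on converting the slant TEC observed along each receiver-satellite ray into vertical TEC. A minimal sketch of the standard thin-shell (single-layer) mapping function follows; this is the textbook model, and the 450 km shell height is an illustrative assumption, not necessarily the exact mapping the authors use:

```python
import math

R_E = 6371.0   # mean Earth radius, km
H = 450.0      # assumed single-layer (thin-shell) height, km

def slant_to_vertical_tec(stec, elevation_deg, shell_height_km=H):
    """Map slant TEC to vertical TEC with the standard thin-shell
    (single-layer) mapping function. This is the textbook model, not
    necessarily the exact mapping used by the authors."""
    z = math.radians(90.0 - elevation_deg)          # zenith angle at receiver
    sin_zp = (R_E / (R_E + shell_height_km)) * math.sin(z)
    cos_zp = math.sqrt(1.0 - sin_zp ** 2)           # zenith angle at pierce point
    return stec * cos_zp

# A ray at the zenith needs no mapping: VTEC equals STEC.
print(slant_to_vertical_tec(30.0, 90.0))  # -> 30.0
```

At lower elevations the mapping shrinks the slant value, since the ray crosses a longer ionospheric path than a vertical one.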
Mon, 02 May 2016 08:39:12 GMT
http://hdl.handle.net/2117/86212
Medium Rate Speech Coding with Vector Quantization
Masgrau Gómez, Enrique José; Mariño Acebal, José Bernardo; Moreno Bilbao, M. Asunción
Tue, 26 Apr 2016 15:40:05 GMT
http://hdl.handle.net/2117/86201
Adaptive spectrum estimation with linear constraints
Vázquez Grau, Gregorio; Vallverdú Bayés, Francesc
A general constrained adaptive method is developed and applied to the spectral estimation problem. The method can be used in a wide range of situations; that is, different estimators can be obtained with it. The algorithm is formulated in a variational framework, and the resulting nonlinear system is solved with a constrained adaptive method applied to a digitized version of the spectrum. The constraints are taken to be a set of known correlation values, which may be located at non-consecutive lags. The method is also generalized so that it can be used in a multidimensional framework. As an example, a two-dimensional maximum entropy spectrum is presented.
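For the special case of consecutive correlation lags, the maximum entropy spectrum consistent with the known values has the classical closed form of an AR spectrum obtained from the Yule-Walker equations. A minimal sketch of that textbook case (the paper's adaptive method is more general, handling non-consecutive lags and multiple dimensions):

```python
import numpy as np

def max_entropy_spectrum(r, freqs):
    """Maximum-entropy (AR) spectrum matching the known consecutive
    correlation lags r[0..p]. Classical Yule-Walker solution; the paper's
    adaptive method also covers non-consecutive lags, which this
    closed-form sketch does not."""
    r = np.asarray(r, dtype=float)
    p = len(r) - 1
    # Solve the Yule-Walker equations R a = -r[1:] for the AR coefficients.
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    a = np.linalg.solve(R, -r[1:])
    a_full = np.concatenate(([1.0], a))
    sigma2 = r[0] + np.dot(a, r[1:])                # prediction-error power
    # Evaluate S(f) = sigma2 / |A(e^{j 2 pi f})|^2 on the requested grid.
    z = np.exp(-2j * np.pi * np.outer(freqs, np.arange(p + 1)))
    A = z @ a_full
    return sigma2 / np.abs(A) ** 2

freqs = np.linspace(0.0, 0.5, 256)
# Correlation lags of an AR(1) process x[n] = 0.9 x[n-1] + w[n]:
r = [0.9 ** k for k in range(3)]
S = max_entropy_spectrum(r, freqs)  # low-pass shape peaking at f = 0
```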
Tue, 26 Apr 2016 13:05:44 GMT
http://hdl.handle.net/2117/86200
Leveraging online user feedback to improve statistical machine translation
Formiga, Lluís; Barrón-Cedeño, Alberto; Marquez, Lluis; Henriquez, Carlos A; Mariño Acebal, José Bernardo
In this article we present a three-step methodology for dynamically improving a statistical machine translation (SMT) system by incorporating human feedback in the form of free edits on the system translations. We target feedback provided by casual users, which is typically error-prone. Thus, we first propose a filtering step to automatically identify the better user-edited translations and discard the useless ones. A second step produces a pivot-based alignment between source and user-edited sentences, focusing on the errors made by the system. Finally, a third step produces a new translation model and combines it linearly with the one from the original system. We perform a thorough evaluation on a real-world dataset collected from the Reverso.net translation service and show that every step in our methodology contributes significantly to improving a general-purpose SMT system. Interestingly, the quality improvement is due not only to the increase in lexical coverage, but also to better lexical selection, reordering, and morphology. Finally, we show the robustness of the methodology by applying it to a different scenario, in which the new examples come from an automatically Web-crawled parallel corpus. Using exactly the same architecture and models again provides a significant improvement in the translation quality of a general-purpose baseline SMT system.
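The filtering step can be illustrated with a much simpler proxy than the filter developed in the article: keep only user edits at a moderate normalized edit distance from the system output, on the grounds that near-identical edits add nothing and wildly different ones are probably noise. The thresholds below are purely illustrative assumptions, not values from the paper:

```python
def levenshtein(a, b):
    """Word-level edit distance via dynamic programming."""
    a, b = a.split(), b.split()
    prev = list(range(len(b) + 1))
    for i, wa in enumerate(a, 1):
        cur = [i]
        for j, wb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1,
                           prev[j - 1] + (wa != wb)))
        prev = cur
    return prev[-1]

def keep_edit(system_out, user_edit, lo=0.05, hi=0.6):
    """Keep a user edit only if its normalized distance to the system
    output is moderate. Thresholds are illustrative, not from the paper."""
    d = levenshtein(system_out, user_edit)
    n = max(len(system_out.split()), len(user_edit.split()))
    return lo <= d / n <= hi

print(keep_edit("the cat sat on mat", "the cat sat on the mat"))  # True
print(keep_edit("the cat sat on mat", "the cat sat on mat"))      # False (no change)
```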
Tue, 26 Apr 2016 12:55:39 GMT
http://hdl.handle.net/2117/86199
On the Use of Higher Order Information in SVD Based Methods
Vázquez Grau, Gregorio; Vallverdú Bayés, Francesc
Tue, 26 Apr 2016 12:50:19 GMT
http://hdl.handle.net/2117/86195
Cross spectrum ML estimate
Lagunas Hernandez, Miguel A.; Santamaría Pérez, María Eugenia; Gasull Llampallas, Antoni; Moreno Bilbao, M. Asunción
This work reports how to include general concepts of the one-dimensional MLM procedure in a two-channel cross-spectrum estimation problem. It is shown that there is no difficulty in extrapolating the well-known procedures for auto-spectrum estimation to the cross-spectrum, provided the original procedure can be explained as a filter-bank analysis. The resulting cross-spectrum estimate formally exhibits, in terms of resolution and low side-lobe behavior, the same excellent features that the normalized maximum likelihood procedure, reported previously by the authors, shows in the auto-spectrum problem.
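The filter-bank reading of the MLM estimator makes the cross-spectrum extension concrete: design the minimum-power, unit-gain analysis filter of each channel at the frequency of interest, then take the cross power between the filtered outputs. A sketch under that reading; the normalization here follows the generic filter-bank construction, not necessarily the authors' normalized variant:

```python
import numpy as np

def mlm_filter(R, f, p):
    """MLM (Capon) analysis filter at normalized frequency f: the
    minimum-power filter with unit gain at f, w = R^{-1} e / (e^H R^{-1} e)."""
    e = np.exp(2j * np.pi * f * np.arange(p))
    Ri_e = np.linalg.solve(R, e)
    return Ri_e / (e.conj() @ Ri_e)

def mlm_cross_spectrum(Rxx, Ryy, Rxy, f):
    """Filter-bank cross-spectrum estimate: pass each channel through its
    own MLM filter and take the cross power, Sxy(f) = wx^H Rxy wy."""
    p = Rxx.shape[0]
    wx = mlm_filter(Rxx, f, p)
    wy = mlm_filter(Ryy, f, p)
    return wx.conj() @ Rxy @ wy
```

When the two channels coincide (Rxx = Ryy = Rxy), the estimate reduces to the classical MLM auto-spectrum 1/(e^H R^{-1} e), as the filter-bank view requires.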
Tue, 26 Apr 2016 12:35:55 GMT
http://hdl.handle.net/2117/86149
Método MLNq para arrays de alta resolución (MLNq method for high-resolution arrays)
Gasull Llampallas, Antoni; Lagunas Hernandez, Miguel A.; Fernández Rubio, Juan Antonio; Moreno Bilbao, M. Asunción
Spectral analysis techniques are applied to the bearing estimation problem. Each of them yields a different array beamformer. We present here a generalized normalized Maximum Likelihood method which offers resolution comparable to that of the singular value decomposition methods, but with a smaller computational load.
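A family of normalized-ML estimators of this kind can be sketched as a ratio of quadratic forms in powers of the inverse covariance matrix. The exponent convention below is an assumption for illustration and may differ from the paper's exact MLNq definition:

```python
import numpy as np

def mlnq_spectrum(R, f, q=1):
    """Normalized-ML family of estimators:
    S_q(f) = (e^H R^{-q} e) / (e^H R^{-(q+1)} e).
    q = 0 is proportional to the classical MLM/Capon estimate and q = 1
    to the normalized MLM; larger q sharpens spectral peaks. The exact
    convention of the paper's MLNq may differ -- this form is an
    assumption for illustration."""
    p = R.shape[0]
    e = np.exp(2j * np.pi * f * np.arange(p))
    Rinv = np.linalg.inv(R)
    num = e.conj() @ np.linalg.matrix_power(Rinv, q) @ e
    den = e.conj() @ np.linalg.matrix_power(Rinv, q + 1) @ e
    return (num / den).real
```

For white noise (R = I) the estimate is flat and equal to the noise power for every q, a basic sanity check on any normalized estimator of this family.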
Mon, 25 Apr 2016 13:36:49 GMT
http://hdl.handle.net/2117/86137
A direct approximation technique for designing digital equalizers with simultaneous specification of magnitude and phase
Mariño Acebal, José Bernardo; Figueiras Vidal, Aníbal R.
A new direct method for designing nonrecursive and recursive digital equalizers approximating specified attenuation and phase responses is introduced. The method allows both magnitude and phase characteristics to be controlled independently; to the authors' knowledge, this is the only direct method proposed in the literature which achieves this independent control in nonrecursive designs without using successive equalization.
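The standard least-squares analogue of simultaneous magnitude-and-phase specification is to fit the FIR taps directly against a complex target response, so both characteristics are controlled in one solve. The sketch below is that generic LS design, not the authors' direct approximation technique:

```python
import numpy as np

def ls_fir_complex(freqs, desired, num_taps):
    """Least-squares FIR design matching a *complex* target response
    (magnitude and phase jointly) on a grid of normalized frequencies.
    Generic LS analogue, not the paper's specific direct method."""
    n = np.arange(num_taps)
    W = np.exp(-2j * np.pi * np.outer(freqs, n))    # DTFT matrix
    h, *_ = np.linalg.lstsq(W, desired, rcond=None)
    return h

# Target: a pure 3-sample delay, H(f) = e^{-j 2 pi f 3}; the LS solution
# should recover a unit impulse at tap 3.
freqs = np.linspace(0, 0.5, 64)
desired = np.exp(-2j * np.pi * freqs * 3)
h = ls_fir_complex(freqs, desired, 8)
```

Since the target lies exactly in the span of the design matrix here, the fit is exact; for an arbitrary magnitude/phase specification the same solve returns the best approximation in the least-squares sense.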
Mon, 25 Apr 2016 12:38:22 GMT
http://hdl.handle.net/2117/86104
Randomizing ties in a sign radar detector
Figueiras Vidal, Aníbal R.; Mariño Acebal, José Bernardo; Lagunas Hernandez, Miguel A.; García Gómez, Ramón; Martín Funke, Enrique
A general formulation to assess the effects of typical randomization methods (RMs) for a digital implementation of the Generalized Sign Test (GST) detector in radar is introduced. A first approximation leads to some basic restrictions to be imposed on the RMs. Under these restrictions, when the approximation is acceptable, our formulation allows the false alarm and detection probabilities (PFA and PD) obtainable with each RM to be evaluated easily as a function of the quantizing step (q) of the video samples, and thus the most appropriate RM to be selected. Besides this, by comparing the values of PFA and PD with those of the continuous case, we can determine the maximum q that keeps the variations due to quantization (which has parametric effects) small enough. In such a way, a maximum dynamic range and a basically nonparametric behaviour are achieved. An example illustrates the application of the theory.
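The basic detector can be sketched with a fair coin as the randomization method for ties, which become possible once the video samples are quantized. The fair coin is only one RM among those the paper analyzes, and the fixed threshold and count-based test below are a simplification of the GST:

```python
import random

def sign_detector(samples, threshold, k, rng=random):
    """Sign-test detector with randomized ties: count samples above the
    threshold, breaking exact ties (possible with quantized video
    samples) with a fair coin, and declare a detection when the count
    reaches k. A fair coin is just one possible randomization method."""
    count = 0
    for x in samples:
        if x > threshold:
            count += 1
        elif x == threshold and rng.random() < 0.5:   # randomize the tie
            count += 1
    return count >= k

print(sign_detector([3, 5, 7, 9], threshold=4, k=3))  # True: 5, 7, 9 exceed 4
```

Randomizing rather than always counting (or always discarding) ties is what keeps the false alarm probability at its nominal nonparametric value under quantization.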
Fri, 22 Apr 2016 13:44:15 GMT
http://hdl.handle.net/2117/86071
Envelope and instantaneous phase considerations in speech modelling
Moreno Bilbao, M. Asunción; Lagunas Hernandez, Miguel A.
The authors present a low-bit-rate coding system in which the envelope and instantaneous phase of the residual are used. A time-varying filter (short-delay filter) is excited by a signal composed of a parametric version of the residual multiplied by a sequence from a codebook. Two alternatives are studied for the design of the codebook: sequences formed by random pulses, and sequences formed by random phases to simulate the instantaneous phase of the residual.
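The envelope/instantaneous-phase decomposition of the residual can be computed from the analytic signal. The sketch below uses the standard FFT-based Hilbert transform; it only illustrates the decomposition the coder builds on, not the paper's parametric residual model:

```python
import numpy as np

def envelope_and_phase(x):
    """Split a real signal into its envelope and instantaneous phase via
    the analytic signal (FFT-based Hilbert transform): zero the negative
    frequencies, double the positive ones, and inverse-transform."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    analytic = np.fft.ifft(X * h)
    return np.abs(analytic), np.angle(analytic)

t = np.arange(256)
x = np.cos(2 * np.pi * 16 * t / 256)   # pure cosine, 16 cycles
env, phase = envelope_and_phase(x)
# For a pure cosine the envelope is constant and equal to 1.
```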
Thu, 21 Apr 2016 14:14:43 GMT