We address the problem of filtering out localized time-frequency components in signals. The problem is formulated as the minimization of a suitable quadratic form, which involves a data fidelity term on the short-time Fourier transform outside the support of the undesired component, and an energy penalization term inside the support. The minimization yields a linear system whose solution can be expressed in closed form using Gabor multipliers.
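As a sketch of the objective described above (the STFT operator $V$, observed signal $s$, mask $\mathbf{1}_{\Omega}$ on the support $\Omega$ of the undesired component, and weight $\lambda$ are illustrative notation, not the paper's), the quadratic form may read

```latex
\min_{x}\ \bigl\|\,\mathbf{1}_{\Omega^{c}}\,(Vx - Vs)\,\bigr\|_2^2
\;+\; \lambda\,\bigl\|\,\mathbf{1}_{\Omega}\,Vx\,\bigr\|_2^2 ,
```

whose normal equations are linear in $x$ and involve operators of the form $V^{*}\operatorname{diag}(\mathbf{1}_{\Omega})V$, i.e. Gabor multipliers with mask $\mathbf{1}_{\Omega}$.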
In this paper, we prove a convergence result for a discretization of the three-dimensional stationary compressible Navier-Stokes equations, assuming an ideal gas pressure law p(ρ) = aρ^γ with γ > 3/2. This is the first convergence result for a numerical method with adiabatic exponent γ less than 3 when the space dimension is three. The considered numerical scheme combines finite volume techniques for the convection with Crouzeix-Raviart finite elements for the diffusion.
Propagation in tubes with varying cross section and visco-thermal wall effects is a classical problem in musical acoustics. The earliest method for treating it was division of the tube into a large number of short cylinders. Division into short conical frustums, with wall effects assumed independent of the radius, is better but remains time-consuming for narrow tubes at low frequencies. Here, the use of the WKB method to obtain the transfer matrix of a truncated cone without any division is investigated.
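The short-cylinder division mentioned above can be sketched numerically: a lossless cylindrical segment has a classical 2×2 transfer matrix relating pressure and volume flow at its two ends, and chaining segment matrices maps a load impedance back to the tube input. This is a minimal sketch assuming lossless cylinders; it omits the visco-thermal wall losses discussed in the text, and all names are illustrative.

```python
import numpy as np

def cylinder_tm(k, L, Zc):
    """Lossless transfer matrix of a cylindrical segment.
    k: wavenumber, L: segment length, Zc: characteristic impedance rho*c/S."""
    kL = k * L
    return np.array([[np.cos(kL), 1j * Zc * np.sin(kL)],
                     [1j * np.sin(kL) / Zc, np.cos(kL)]])

def input_impedance(segments, k, Z_load):
    """Chain segment matrices (input to output end) and map the load
    impedance Z_load back to the input of the tube."""
    T = np.eye(2)
    for L, Zc in segments:
        T = T @ cylinder_tm(k, L, Zc)
    A, B = T[0]
    C, D = T[1]
    return (A * Z_load + B) / (C * Z_load + D)
```

A cone can then be approximated by stacking many short cylinders with gradually varying Zc; the abstract's point is precisely that this stacking becomes expensive for narrow tubes at low frequencies.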
Underdetermined or ill-posed inverse problems require additional information to yield sound solutions with tractable optimization algorithms. Sparsity provides powerful heuristics to this end, with numerous applications in signal restoration, image recovery, and machine learning. Since the $\ell_0$ counting measure is barely tractable, many statistical and learning approaches rely on computable proxies, such as the $\ell_1$ norm.
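A standard instance of such an $\ell_1$ proxy is the penalized least-squares (lasso) problem, which can be solved by iterative soft-thresholding (ISTA). The sketch below is a generic illustration of that technique, not the method of any particular paper; all names and parameters are illustrative.

```python
import numpy as np

def ista(A, y, lam, n_iter=500):
    """Iterative soft-thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)             # gradient of the quadratic term
        z = x - step * grad                  # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
    return x
```

The soft-threshold step is exactly the proximal operator of the $\ell_1$ norm: entries smaller than the threshold are set to zero, which is how the surrogate promotes sparsity.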
Spectral estimation generally aims at determining, from a single realization of a given signal, the distribution of its power as a function of frequency. In this paper, we focus on multivariate, locally time-warped signals. We show that the spectral estimation problem can also be interpreted as a doubly nonstationary blind source separation (BSS) problem, where both the mixing matrix and the original sources contribute to the nonstationarity.
Purpose of review: Highly coordinated cellular interactions occur in the healthy and pathologic adult rodent central nervous system (CNS). Until recently, technical challenges have restricted the analysis of these events to largely static modes of study, such as immunofluorescence and electron microscopy on fixed tissues. The development of intravital imaging with subcellular resolution is required to probe the dynamics of these events in their natural context, the living brain.
Flexible and stretchable photonics are emerging fields aiming to develop novel applications in which devices must conform to uneven surfaces, or where lightness and reduced thickness are major requirements. However, owing to the relatively small refractive index of transparent soft matter, including most polymers, these materials are not well suited for light management at visible and near-infrared frequencies.
Nowadays, data management and processing systems are expected to store and process large time series. As the number of observed, interrelated variables grows, their prediction has become increasingly complicated, and using all the variables raises problems for classical prediction models.
Handling time series forecasting with many predictors is a popular topic in the era of "Big data", where vast amounts of observed variables are stored and used in analytic processes. Classical prediction models face limitations when applied to large-scale data: using all the available predictors increases the computational time and does not necessarily improve the forecast accuracy.
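One standard way to avoid regressing on all predictors, shown here only as an illustrative baseline and not as the approach of the text, is to compress them into a few principal-component factors before fitting (principal-component regression); all names below are illustrative.

```python
import numpy as np

def pcr_fit(X, y, n_factors=3):
    """Fit a principal-component regression: compress the columns of X
    into n_factors leading components, then regress y on those factors."""
    mean = X.mean(axis=0)
    Xc = X - mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:n_factors]               # leading principal directions
    F = Xc @ components.T                      # factor scores
    design = np.column_stack([np.ones(len(F)), F])
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    return beta, components, mean

def pcr_predict(beta, components, mean, X_new):
    """Project new predictors onto the fitted components and apply the fit."""
    F_new = (X_new - mean) @ components.T
    return np.column_stack([np.ones(len(F_new)), F_new]) @ beta
```

With a handful of factors instead of hundreds of raw predictors, both the fitting cost and the variance of the estimates drop, which is the trade-off the abstract alludes to.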