Project author: vbisin
Project description:
Python code for Vittorio Bisin's Master's Thesis from the Courant Institute of Mathematical Sciences: 'A Study in Total Variation Denoising and its Validity in a Recently Proposed Denoising Algorithm'
Language: Python
Project URL: git://github.com/vbisin/Applied-Mathematics-Thesis.git
A Study in Total Variation Denoising and its Validity in a Recently Proposed Denoising Algorithm
driver.py
The driver script for the algorithm. Outputs graphs and the percentage of signals successfully denoised by the learned model on the training and test sets.
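For orientation, here is a toy, self-contained version of that flow (generate noisy step signals, denoise, score). The median-filter denoiser and the success criterion (relative error below a threshold) are stand-ins assumed purely for illustration; they are not the thesis model.

```python
# Toy driver flow: generate -> denoise -> score. The median filter and the
# "relative error below tol" criterion are illustrative stand-ins, not the
# thesis model.
import numpy as np
from scipy.signal import medfilt

def percent_denoised(clean, denoised, tol=0.1):
    """Percent of samples whose relative L2 error falls below tol."""
    errs = np.linalg.norm(denoised - clean, axis=1) / np.linalg.norm(clean, axis=1)
    return 100.0 * np.mean(errs < tol)

rng = np.random.default_rng(0)
clean = np.repeat(rng.normal(size=(50, 8)), 8, axis=1)  # 50 step signals of length 64
noisy = clean + 0.1 * rng.normal(size=clean.shape)
denoised = np.array([medfilt(y, kernel_size=5) for y in noisy])
print(f"{percent_denoised(clean, denoised):.1f}% successfully denoised")
```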
createStep.py
- createSteps - Creates step (i.e., piecewise constant) signals, where each signal has a number of jumps equal to a fixed multiple of the signal's length. Gaussian noise is then added to each signal (a sketch of this generation step follows this list).
- speedCreateSteps - Creates the step function (without noise) for each sample.
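A minimal sketch of the generation step. The uniform random jump locations and the default parameter values are assumptions; only the jump-count rule and the added Gaussian noise follow from the description above.

```python
import numpy as np

def make_noisy_step(length=64, jump_frac=0.125, noise_std=0.1, rng=None):
    """Return (clean, noisy): a piecewise constant signal and its noisy copy."""
    if rng is None:
        rng = np.random.default_rng()
    n_jumps = max(1, int(jump_frac * length))
    # Distinct jump locations; each segment between jumps is constant.
    jumps = np.sort(rng.choice(np.arange(1, length), size=n_jumps, replace=False))
    levels = rng.normal(size=n_jumps + 1)
    clean = np.empty(length)
    bounds = np.concatenate(([0], jumps, [length]))
    for lo, hi, level in zip(bounds[:-1], bounds[1:], levels):
        clean[lo:hi] = level
    return clean, clean + noise_std * rng.normal(size=length)
```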
sgd.py
- multiSGDthres - Runs stochastic gradient descent to optimize the kernel (W) and/or the radial basis function parameter (alpha); see the sketch below.
- SGDSample - Updates the alpha and W values after computing the gradient for each sample.
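A generic per-sample SGD loop of this kind. The `grad_alpha` and `grad_W` callables are hypothetical stand-ins for the gradients in gradients.py, and the constant learning rate `lr` is a simplification (the repository estimates step sizes with the Armijo rule, per Armijo.py below).

```python
import numpy as np

def sgd(samples, alpha, W, grad_alpha, grad_W, lr=1e-3, epochs=10, rng=None):
    """samples: list of (noisy, clean) pairs; grad_* return arrays shaped like alpha / W."""
    if rng is None:
        rng = np.random.default_rng()
    for _ in range(epochs):
        for i in rng.permutation(len(samples)):  # fresh sample order each epoch
            noisy, clean = samples[i]
            alpha = alpha - lr * grad_alpha(alpha, W, noisy, clean)
            W = W - lr * grad_W(alpha, W, noisy, clean)
    return alpha, W
```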
batchSGD.py
- batchSGD - Runs the mini-batch stochastic gradient descent algorithm to optimize the kernel (W) and/or the radial basis function parameter (alpha); see the sketch below.
- batchSGDsample - Updates the alpha and W values after computing the gradient for each batch.
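The mini-batch variant averages the per-sample gradients over each batch before updating (same hypothetical callables and constant learning rate as in the sketch above).

```python
import numpy as np

def batch_sgd(samples, alpha, W, grad_alpha, grad_W, lr=1e-3,
              batch_size=8, epochs=10, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    for _ in range(epochs):
        order = rng.permutation(len(samples))
        for start in range(0, len(samples), batch_size):
            batch = [samples[i] for i in order[start:start + batch_size]]
            # Average the per-sample gradients over the batch, then update.
            g_a = np.mean([grad_alpha(alpha, W, y, x) for y, x in batch], axis=0)
            g_W = np.mean([grad_W(alpha, W, y, x) for y, x in batch], axis=0)
            alpha, W = alpha - lr * g_a, W - lr * g_W
    return alpha, W
```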
rbf.py
- rbfF - Calculates the Gaussian radial basis functions (before they are weighted by the optimized parameter, alpha).
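A sketch of evaluating Gaussian RBFs at each entry of a transformed signal. The bandwidth `s` is an assumed parameter; the model's nonlinearity would then be this matrix multiplied by alpha.

```python
import numpy as np

def gaussian_rbfs(z, centers, s=1.0):
    """F[i, j] = exp(-(z[i] - centers[j])**2 / (2 * s**2))."""
    diffs = z[:, None] - centers[None, :]
    return np.exp(-diffs**2 / (2 * s**2))
```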
rbfCenters.py
- GRBFCenters - Calculates the equally spaced centers of the Gaussian RBFs, given the range of the noisy signals under the linear transform W.
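Under that reading, a minimal sketch: centers placed uniformly over the observed range of the W-transformed signals (the center count is an assumed parameter).

```python
import numpy as np

def rbf_centers(transformed, n_centers=50):
    """Equally spaced centers spanning the range of the W-transformed signals."""
    return np.linspace(transformed.min(), transformed.max(), n_centers)
```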
estimateSignal.py
Computes the model’s predicted signal.
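The exact formula lives in the write-up; one plausible assembly of the pieces above (an assumption, not confirmed by the source) filters the noisy signal by W, applies the alpha-weighted RBF nonlinearity, filters back by W^T, and subtracts the result from the input.

```python
import numpy as np

def estimate_signal(y, W, alpha, centers, s=1.0):
    z = W @ y                                                     # filter noisy signal
    F = np.exp(-(z[:, None] - centers[None, :])**2 / (2 * s**2))  # Gaussian RBFs
    return y - W.T @ (F @ alpha)                                  # assumed model form
```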
gradients.py
- dFdGamma - Calculates the gradient of the objective function w.r.t. the variables inside the norm.
- alphaGradient - Calculates the gradient of the objective function w.r.t. alpha (see the sketch below).
- WGradient - Calculates the gradient of the objective function w.r.t. the kernel (W).
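Under the model form assumed in the estimateSignal.py sketch and a squared-error objective J = 0.5 * ||x_hat - x||^2, the alpha gradient has a short closed form, sketched below. The W gradient additionally requires dGamma/dW and is handled by dGammadW.py.

```python
import numpy as np

def alpha_gradient(y, x, W, alpha, centers, s=1.0):
    z = W @ y
    F = np.exp(-(z[:, None] - centers[None, :])**2 / (2 * s**2))
    residual = (y - W.T @ (F @ alpha)) - x   # x_hat - x
    return -F.T @ (W @ residual)             # dJ/dalpha for J = 0.5*||x_hat - x||^2
```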
dGammadW.py
- dGammadW - Returns the derivative of gamma (see the gradients appendix in the write-up) w.r.t. the kernel (W).
- dGammadWEntry - Returns a single entry of the above derivative, dGammadW.
Armijo.py
- armijoAlpha - Estimates the optimal step size for the alpha gradient using the Armijo Rule.
- armijoW - Estimates the optimal step size for the W gradient using the Armijo Rule.
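Both functions apply the Armijo backtracking rule. A generic version, assuming a gradient-descent direction: shrink the step until the objective decreases by at least a fraction of what the gradient predicts.

```python
import numpy as np

def armijo_step(f, x, grad, step=1.0, beta=0.5, c=1e-4, max_iter=50):
    """Shrink step until f(x - step*grad) <= f(x) - c*step*||grad||^2."""
    fx, g2 = f(x), np.vdot(grad, grad)
    for _ in range(max_iter):
        if f(x - step * grad) <= fx - c * step * g2:
            break
        step *= beta
    return step
```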
gradientCheck.py
- alphaGradCheck - Checks whether the gradient w.r.t. alpha is correct (see the sketch below).
- speedAlphaGradCheck - Applies the limit definition of the derivative to each entry of alpha.
- WGradCheck - Checks whether the gradient w.r.t. W is correct.
- speedWGradCheck - Applies the limit definition of the derivative to each entry of the kernel, W.
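These checks follow the usual finite-difference pattern: perturb each entry and compare the difference quotient against the analytic gradient. A generic sketch (the central difference here is a choice; the repository's entrywise limit-definition checks work in the same spirit):

```python
import numpy as np

def grad_check(f, grad_f, x, h=1e-6, tol=1e-4):
    """Compare an analytic gradient against central finite differences."""
    analytic = np.asarray(grad_f(x)).ravel()
    flat = x.astype(float).ravel()
    numeric = np.zeros(flat.size)
    for i in range(flat.size):
        e = np.zeros(flat.size)
        e[i] = h
        plus, minus = (flat + e).reshape(x.shape), (flat - e).reshape(x.shape)
        numeric[i] = (f(plus) - f(minus)) / (2 * h)
    return np.max(np.abs(numeric - analytic)) < tol
```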
exponentGradCheck.py
- exponentGradCheck - Checks the dGamma/dW derivative referred to in the LaTeX write-up (used in computing the W gradient).
- speedExponentGradCheck - Applies the limit definition of the derivative to each entry of this derivative.