Filter-Gradient-Decent

Update: This project also includes the code for the paper

Kalman Optimizer for Consistent Gradient Descent, Xingyi Yang (ICASSP 2021) [paper]

Course project for ECE 251C at UCSD. Code for the paper

Stochastic Gradient Variance Reduction by Solving a Filtering Problem

In this paper, we propose Filter Gradient Decent (FGD), an efficient stochastic optimization algorithm that makes a consistent estimation of the local gradient by solving an adaptive filtering problem with different designs of filters.
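To illustrate the idea of treating gradient variance reduction as a filtering problem, here is a minimal sketch (not the repository's actual API; class and parameter names are made up for illustration) that replaces the raw stochastic gradient in an SGD step with a low-pass (exponential moving average) filtered estimate:

```python
# Illustrative sketch only: SGD where the noisy gradient is replaced by
# a first-order low-pass filtered estimate, in the spirit of casting
# stochastic gradient variance reduction as an adaptive filtering problem.
import numpy as np

class FilteredSGD:
    """Hypothetical optimizer: SGD with a low-pass gradient filter."""

    def __init__(self, lr=0.1, beta=0.9):
        self.lr = lr        # step size
        self.beta = beta    # filter coefficient (closer to 1 = smoother)
        self.g_hat = None   # running filtered gradient estimate

    def step(self, params, grad):
        if self.g_hat is None:
            self.g_hat = np.zeros_like(grad)
        # exponential moving average of the noisy stochastic gradient
        self.g_hat = self.beta * self.g_hat + (1.0 - self.beta) * grad
        return params - self.lr * self.g_hat

# Toy problem: minimize f(x) = x^2 with noisy gradient observations 2x + noise
rng = np.random.default_rng(0)
opt = FilteredSGD(lr=0.1, beta=0.9)
x = np.array([5.0])
for _ in range(200):
    noisy_grad = 2.0 * x + rng.normal(0.0, 1.0, size=x.shape)
    x = opt.step(x, noisy_grad)
print(x)  # converges near the minimum at 0 despite the gradient noise
```

The paper considers different filter designs (e.g. a Kalman filter in the companion ICASSP work); the moving-average filter above is just the simplest stand-in to show where filtering enters the update rule.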

Usage

  • To do later
