
This is a repository associated with the paper "CAS4DL: Christoffel Adaptive Sampling for function approximation via Deep Learning" by Ben Adcock, Juan M. Cardenas, and Nick Dexter, submitted to Sampling Theory and Applications (SampTa) and also available at https://arxiv.org/abs/2208.12190.


CAS4DL

This repository accompanies the paper

CAS4DL: Christoffel Adaptive Sampling for function approximation via Deep Learning by Ben Adcock, Juan M. Cardenas, and Nick Dexter,

to be published at SampTa in late 2022 and available at https://arxiv.org/abs/2208.12190.

If you have questions or comments about the code, please contact ben_adcock@sfu.ca, jcardena@sfu.ca, or nicholas_dexter@sfu.ca.

Parts of this repository are based on the code for the paper "The gap between theory and practice in function approximation with deep neural networks" by Ben Adcock and Nick Dexter, which is available at https://github.com/ndexter/MLFA.

Code organization

Files are organized into the following main directories:

src

Contains the main MATLAB scripts used to create the figures, organized by figure.

utils

Contains various MATLAB functions needed across the main scripts.

data

Contains the .mat files generated by the scripts in src, organized by figure.

Filenames

These generically take the form

fig_[number]_[row]_[col]

where [number] is the figure number, and [row] and [col] are the row and column numbers in multi-panel figures.
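As a minimal sketch of this naming convention (the helper function and the example name here are hypothetical, for illustration only, and are not part of the repository), a filename can be split back into its figure, row, and column indices:

```python
import re

# Matches data filenames of the form fig_[number]_[row]_[col],
# e.g. "fig_3_1_2" (hypothetical example, not a file from this repository).
FIG_PATTERN = re.compile(r"^fig_(\d+)_(\d+)_(\d+)$")

def parse_fig_name(name):
    """Return (figure, row, col) as integers, or None if the name does not match."""
    m = FIG_PATTERN.match(name)
    if m is None:
        return None
    return tuple(int(g) for g in m.groups())
```

For example, `parse_fig_name("fig_3_1_2")` yields `(3, 1, 2)`, i.e. figure 3, row 1, column 2.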
