* Significant updates across the MetaAF repo for revisions & code for the IWAENC paper
* Increment wandb version, remove unused functions in higher-order code
* Update arXiv links in READMEs
* Update eval instructions, mention baselines, tweak run commands
* Move hometa_aec implementations into the zoo; fix license
* Increment version to denote breaking changes (0.0.1 -> 1.0.0)
<div align="center">

# Meta-AF: Meta-Learning for Adaptive Filters

[Jonah Casebeer](https://jmcasebeer.github.io)<sup>1*</sup>, [Nicholas J. Bryan](https://ccrma.stanford.edu/~njb/)<sup>2</sup>, and [Paris Smaragdis](https://paris.cs.illinois.edu/)<sup>1</sup>

<sup>1</sup> Department of Computer Science, University of Illinois at Urbana-Champaign<br>
<!-- START doctoc generated TOC please keep comment here to allow auto update -->
<!-- doctoc --maxlevel 2 README.md -->
<!-- DON'T EDIT THIS SECTION, INSTEAD RE-RUN doctoc TO UPDATE -->

**Table of Contents**
<!-- END doctoc generated TOC please keep comment here to allow auto update -->

## Abstract

Adaptive filtering algorithms are pervasive throughout modern society and have had a significant impact on a wide variety of domains including audio processing, biomedical sensing, astrophysics, and many more. Adaptive filters typically operate via specialized online, iterative optimization methods but can be laborious to develop and require domain expertise. In this work, we frame the development of adaptive filters as a deep meta-learning problem and present a framework for learning online, adaptive signal processing algorithms or update rules directly from data using self-supervision. We focus on audio applications and apply our approach to system identification, acoustic echo cancellation, blind equalization, multi-channel dereverberation, and beamforming. For each application, we compare against common baselines and/or state-of-the-art methods and show we can learn high-performing adaptive filters that operate in real-time and, in most cases, significantly outperform specially developed methods for each task using a single general-purpose configuration of our method.
For more details, please see:

"[Meta-AF: Meta-Learning for Adaptive Filters](https://arxiv.org/abs/2204.11942)", [Jonah Casebeer](https://jmcasebeer.github.io), [Nicholas J. Bryan](https://ccrma.stanford.edu/~njb/), and [Paris Smaragdis](https://paris.cs.illinois.edu/), arXiv, 2022. If you use ideas or code from this work, please cite our paper:
```BibTex
@article{casebeer2022meta,
    title={Meta-AF: Meta-Learning for Adaptive Filters},
    author={Casebeer, Jonah and Bryan, Nicholas J and Smaragdis, Paris},
    journal={arXiv preprint arXiv:2204.11942},
    year={2022}
}
```
<!-- <div align="center">
</br>

[**Demos**](#demos)
| [**Code**](#code)
| [**Meta-AF Zoo**](#meta-af-zoo)
| [**License**](#license)
| [**Related Works**](#related-works)

</br>
</div> -->
## Demos

For audio demonstrations of the work and the `metaaf` package in action, please check out our [demo website](https://jmcasebeer.github.io/projects/metaaf). You'll be able to find demos for the five core adaptive filtering problems.
## Code

We open source all code for the work via our `metaaf` Python pip package. The `metaaf` package enables meta-learning optimizers for near-arbitrary adaptive filters with any differentiable objective. `metaaf` automatically manages online overlap-save and overlap-add for single/multi-channel and single/multi-frame filters. We also include generic implementations of LMS, RMSProp, NLMS, and RLS for benchmarking purposes. Finally, `metaaf` includes implementations of generic GRU-based optimizers, which are compatible with any filter defined in the `metaaf` format. Below, you can find example usage, usage for several common adaptive filter tasks (in the adaptive filter zoo), and installation instructions.
The `metaaf` package is relatively small, limited to a dozen files which enable much more functionality than we demo here. The core meta-learning code is in `core.py`, the buffered and online filter implementations are in `filter.py`, and the RNN-based optimizers are in `optimizer_gru.py` and `optimizer_fgru.py`. The remaining files hold utilities and generic implementations of baseline optimizers. `meta.py` contains a class for managing training.
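As a point of reference for the hand-derived baselines mentioned above, the classic NLMS update fits in a few lines. The sketch below is illustrative NumPy with hypothetical variable names, not the `metaaf` API, run on a toy system-identification setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: an unknown FIR system filters white noise; the adaptive
# filter must recover it from input/output pairs alone.
taps = 16
w_true = rng.standard_normal(taps)        # unknown system
u = rng.standard_normal(4096)             # input signal
d = np.convolve(u, w_true)[: len(u)]      # observed system output

# Classic NLMS update rule, run online sample-by-sample.
w = np.zeros(taps)
mu, eps = 0.5, 1e-6
for n in range(taps - 1, len(u)):
    x = u[n - taps + 1 : n + 1][::-1]     # newest sample first
    e = d[n] - w @ x                      # prediction error
    w += mu * e * x / (x @ x + eps)       # normalized step

print(np.allclose(w, w_true, atol=1e-3))  # True: the filter identified the system
```

The point of Meta-AF is that this hand-derived update (the `mu * e * x / ...` line) is replaced by a learned neural update rule trained end-to-end from data.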
### Installation

To install the `metaaf` Python package, you will need a working JAX install. You can set one up by following the official directions [here](https://github.com/google/jax#installation). Below is an example of the commands we use to set up a new conda environment called `metaenv` in which we install `metaaf` and any dependencies.

#### GPU Setup

```{bash}
### GPU
# Install all the cuda and cudnn prerequisites
conda activate metaenv

# Actually install jax
# You may need to change the cuda/cudnn version numbers depending on your machine
```
Finally, with the prerequisites done, you can install `metaaf` by cloning the repo, moving into the base directory, and running `pip install -e ./`. This `pip install` adds the remaining dependencies. To run the demo notebook, you also need to:

```{bash}
pip install matplotlib
pip install ipywidgets
```
### Example Usage

The `metaaf` package provides several important modules to facilitate training. The first is the `MetaAFTrainer`, a class which manages training. To use the `MetaAFTrainer`, we need to define a filter architecture and a dataset. `metaaf` adopts several conventions to simplify training and automate procedures like buffering. In [this notebook](examples/sysid_demo.ipynb), we walk through this process and demonstrate it on a toy system-identification task. In this section, we explain that toy task and the automatic argparse utilities. To see a full-scale example, proceed to the next section, where we describe the Meta-AF Zoo.
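To give a flavor of the self-supervised framing before diving into the notebook, here is a deliberately simplified caricature in plain NumPy (not the `metaaf` GRU optimizer or its API): instead of hand-picking an update rule's constants, we select them using only the filter's own error signal, with no access to the true system.

```python
import numpy as np

def run_lms(mu, u, d, taps=8):
    """Run plain LMS with step size mu; return the mean squared error."""
    w = np.zeros(taps)
    sq_errs = []
    for n in range(taps - 1, len(u)):
        x = u[n - taps + 1 : n + 1][::-1]
        e = d[n] - w @ x            # self-supervised signal: no ground truth needed
        sq_errs.append(e * e)
        w += mu * e * x             # plain (unnormalized) LMS step
    return float(np.mean(sq_errs))

rng = np.random.default_rng(1)
w_true = rng.standard_normal(8)     # unknown system, never seen by the tuner
u = rng.standard_normal(2048)
d = np.convolve(u, w_true)[: len(u)]

# "Meta-learn" the optimizer in miniature: choose the update rule's parameter
# purely from the filter's own error. metaaf applies the same principle to a
# full learned (GRU) update rule instead of a single scalar.
candidates = [0.002, 0.01, 0.05]
best_mu = min(candidates, key=lambda mu: run_lms(mu, u, d))
```

Here the "search space" is a scalar grid; in `metaaf` the update rule itself is a neural network trained by backpropagation through the filter's online error.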
## Meta-AF Zoo

The [Meta-AF Zoo](zoo/README.md) contains implementations for system identification, acoustic echo cancellation, equalization, weighted prediction error dereverberation, and a generalized sidelobe canceller beamformer, all in the `metaaf` framework. You can find instructions for how to run, evaluate, and set up those models [here](zoo/README.md). For trained weights and tuned baselines, please see the tagged release zip file [here](https://github.com/adobe-research/MetaAF/releases/tag/v0.1.0).
## License

All core utility code within the `metaaf` folder is licensed via the [University of Illinois Open Source License](metaaf/LICENSE). All code within the `zoo` folder and model weights are licensed via the [Adobe Research License](zoo/LICENSE). Copyright (c) Adobe Systems Incorporated. All rights reserved.
## Related Works

An extension of this work using `metaaf` can be found [here](zoo/hometa_aec/README.md):

"[Meta-Learning for Adaptive Filters with Higher-Order Frequency Dependencies](https://arxiv.org/abs/2209.09955)", [Junkai Wu](https://www.linkedin.com/in/junkai-wu-19015b198/), [Jonah Casebeer](https://jmcasebeer.github.io), [Nicholas J. Bryan](https://ccrma.stanford.edu/~njb/), and [Paris Smaragdis](https://paris.cs.illinois.edu/), IWAENC, 2022.

```BibTex
@article{wu2022metalearning,
    title={Meta-Learning for Adaptive Filters with Higher-Order Frequency Dependencies},
    author={Wu, Junkai and Casebeer, Jonah and Bryan, Nicholas J. and Smaragdis, Paris},
    journal={arXiv preprint arXiv:2209.09955},
    year={2022}
}
```
An early version of this work:

"[Auto-DSP: Learning to Optimize Acoustic Echo Cancellers](https://arxiv.org/abs/2110.04284)", [Jonah Casebeer](https://jmcasebeer.github.io), [Nicholas J. Bryan](https://ccrma.stanford.edu/~njb/), and [Paris Smaragdis](https://paris.cs.illinois.edu/), WASPAA, 2021.