
Commit 08caee6

Updates for TASL revisions and IWAENC paper (#6)
* Significant updates across the MetaAF repo for TASL revisions and code for the IWAENC paper: increment wandb version, remove unused functions in higher-order code, update arXiv links in READMEs, update eval instructions, mention baselines, tweak run commands, move hometa AEC implementations into the zoo, and fix the license.
* Increment version to denote breaking changes: 0.0.1 -> 1.0.0
1 parent 84ade68 commit 08caee6


46 files changed: +3586 -677 lines

README.md

Lines changed: 47 additions & 40 deletions
@@ -1,8 +1,8 @@
<div align="center">

# Meta-AF: Meta-Learning for Adaptive Filters

[Jonah Casebeer](https://jmcasebeer.github.io)<sup>1*</sup>, [Nicholas J. Bryan](https://ccrma.stanford.edu/~njb/)<sup>2</sup>, and [Paris Smaragdis](https://paris.cs.illinois.edu/)<sup>1</sup>

<sup>1</sup> Department of Computer Science, University of Illinois at Urbana-Champaign<br>
@@ -13,7 +13,7 @@
[![Demo Video](https://ccrma.stanford.edu/~njb/index_files/metaaf-video-2022.png)](https://youtu.be/incb1QNSvW8)

<!-- START doctoc generated TOC please keep comment here to allow auto update -->
<!-- doctoc --maxlevel 2 README.md -->
<!-- DON'T EDIT THIS SECTION, INSTEAD RE-RUN doctoc TO UPDATE -->
**Table of Contents**

@@ -28,43 +28,33 @@
<!-- END doctoc generated TOC please keep comment here to allow auto update -->

## Abstract

Adaptive filtering algorithms are pervasive throughout modern society and have had a significant impact on a wide variety of domains including audio processing, biomedical sensing, astrophysics, and many more. Adaptive filters typically operate via specialized online, iterative optimization methods but can be laborious to develop and require domain expertise. In this work, we frame the development of adaptive filters as a deep meta-learning problem and present a framework for learning online, adaptive signal processing algorithms or update rules directly from data using self-supervision. We focus on audio applications and apply our approach to system identification, acoustic echo cancellation, blind equalization, multi-channel dereverberation, and beamforming. For each application, we compare against common baselines and/or state-of-the-art methods and show we can learn high-performing adaptive filters that operate in real-time and, in most cases, significantly outperform specially developed methods for each task using a single general-purpose configuration of our method.

For more details, please see:

-"[Meta-AF: Meta-Learning for Adaptive Filters](https://arxiv.org/abs/2204.11942)", [Jonah Casebeer](https://jmcasebeer.github.io), [Nicholas J. Bryan](https://ccrma.stanford.edu/~njb/), and [Paris Smaragdis](https://paris.cs.illinois.edu/), arXiv, 2022. If you use ideas or code from this work, pleace cite our paper:
+"[Meta-AF: Meta-Learning for Adaptive Filters](https://arxiv.org/abs/2204.11942)", [Jonah Casebeer](https://jmcasebeer.github.io), [Nicholas J. Bryan](https://ccrma.stanford.edu/~njb/), and [Paris Smaragdis](https://paris.cs.illinois.edu/), arXiv, 2022. If you use ideas or code from this work, please cite our paper:

```BibTex
-@article{casebeer2022metaaf,
-  title={Meta-AF: Meta-Learning for Adaptive Filters},
-  author={Jonah Casebeer and Nicholas J. Bryan and Paris Smaragdis},
-  year={2022},
-  eprint={2204.11942},
-  archivePrefix={arXiv},
-  primaryClass={cs.SD}
+@article{casebeer2022meta,
+  title={Meta-AF: Meta-Learning for Adaptive Filters},
+  author={Casebeer, Jonah and Bryan, Nicholas J and Smaragdis, Paris},
+  journal={arXiv preprint arXiv:2204.11942},
+  year={2022}
}
```

<!-- <div align="center">
</br>

[**Demos**](#demos)
| [**Code**](#code)
| [**Meta-AF Zoo**](#meta-af-zoo)
| [**License**](#license)
| [**Related Works**](#related-works)

</br>
</div> -->

## Demos

For audio demonstrations of the work and the `metaaf` package in action, please check out our [demo website](https://jmcasebeer.github.io/projects/metaaf). You'll be able to find demos for the five core adaptive filtering problems.
@@ -73,13 +63,14 @@ For audio demonstrations of the work and `metaaf` package in action, please chec

We open-source all code for this work via our `metaaf` Python pip package. The `metaaf` package has functionality that enables meta-learning optimizers for near-arbitrary adaptive filters with any differentiable objective. `metaaf` automatically manages online overlap-save and overlap-add for single/multi-channel and single/multi-frame filters. We also include generic implementations of LMS, RMSProp, NLMS, and RLS for benchmarking purposes. Finally, `metaaf` includes implementations of generic GRU-based optimizers, which are compatible with any filter defined in the `metaaf` format. Below, you can find example usage, usage for several common adaptive filter tasks (in the adaptive filter zoo), and installation instructions.
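
For intuition, below is a textbook-style frequency-domain NLMS update of the kind the bundled baselines implement. This is a minimal illustrative sketch, not `metaaf`'s actual implementation, and the function name is hypothetical:

```{python}
import jax.numpy as jnp

def nlms_update(w, u, e, mu=0.5, eps=1e-8):
    # w: filter weights, u: input block, e: error signal, all complex and in
    # the frequency domain; this is the classic per-bin normalized LMS step.
    return w + mu * jnp.conj(u) * e / (jnp.abs(u) ** 2 + eps)
```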

The `metaaf` package is relatively small, limited to a dozen files that enable much more functionality than we demo here. The core meta-learning code is in `core.py`, the buffered and online filter implementations are in `filter.py`, and the RNN-based optimizers are in `optimizer_gru.py` and `optimizer_fgru.py`. The remaining files hold utilities and generic implementations of baseline optimizers. `meta.py` contains a class for managing training.
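
As rough orientation, imports mirror this file layout. A hypothetical sketch, with module paths assumed from the file names above rather than verified against the package:

```{python}
# Assumed paths: MetaAFTrainer from meta.py, ElementWiseGRU from optimizer_gru.py.
from metaaf.meta import MetaAFTrainer            # training manager
from metaaf.optimizer_gru import ElementWiseGRU  # GRU-based learned optimizer
```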

### Installation

To install the `metaaf` Python package, you will need a working JAX install. You can set one up by following the official directions [here](https://github.com/google/jax#installation). Below is an example of the commands we use to set up a new conda environment called `metaenv`, in which we install `metaaf` and any dependencies.

#### GPU Setup

```{bash}
### GPU
# Install all the cuda and cudnn prerequisites
@@ -92,13 +83,14 @@ conda activate metaenv
# Actually install jax
# You may need to change the cuda/cudnn version numbers depending on your machine
-pip install jax[cuda11_cudnn82]==0.3.7 -f https://storage.googleapis.com/jax-releases/jax_releases.html
+pip install jax[cuda11_cudnn82]==0.3.15 -f https://storage.googleapis.com/jax-releases/jax_releases.html

# Install Haiku
-pip install git+https://github.com/deepmind/dm-haiku@v0.0.6
+pip install git+https://github.com/deepmind/dm-haiku@v0.0.8
```

#### CPU Setup

```{bash}
### CPU. x86 only
conda create -yn metaenv python=3.7 &&
@@ -109,13 +101,12 @@ conda activate metaenv
# Actually install jax
# You may need to change the version numbers depending on your machine
pip install --upgrade pip
-pip install --upgrade "jax[cpu]"==0.3.7
+pip install --upgrade "jax[cpu]"==0.3.15

# Install Haiku
-pip install git+https://github.com/deepmind/dm-haiku@v0.0.6
+pip install git+https://github.com/deepmind/dm-haiku@v0.0.8
```

Finally, with the prerequisites done, you can install `metaaf` by cloning the repo, moving into the base directory, and running `pip install -e ./`. This `pip install` adds the remaining dependencies. To run the demo notebook, you also need to:

```{bash}
@@ -130,8 +121,6 @@ pip install matplotlib
pip install ipywidgets
```
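
To confirm the install worked and JAX sees the expected backend, a quick optional check:

```{python}
# Optional sanity check: lists the accelerators (or CPU) visible to JAX.
import jax
print(jax.devices())
```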

### Example Usage

The `metaaf` package provides several important modules to facilitate training. The first is the `MetaAFTrainer`, a class which manages training. To use the `MetaAFTrainer`, we need to define a filter architecture and a dataset. `metaaf` adopts several conventions to simplify training and automate procedures like buffering. In [this notebook](examples/sysid_demo.ipynb), we walk through this process and demonstrate it on a toy system-identification task. In this section, we explain that toy task and the automatic argparse utilities, sketched just below. To see a full-scale example, proceed to the next section, where we describe the Meta-AF Zoo.
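
The argparse convention roughly works as follows. This is a hedged sketch: `grab_args` appears in the trainer setup below, while `add_args` and the module path are assumptions:

```{python}
import argparse
from metaaf.optimizer_gru import ElementWiseGRU  # path assumed

parser = argparse.ArgumentParser()
# Each component registers its own flags (add_args is assumed here)...
parser = ElementWiseGRU.add_args(parser)
kwargs = vars(parser.parse_args())
# ...and later grabs back only the arguments it owns.
optimizer_kwargs = ElementWiseGRU.grab_args(kwargs)
```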
@@ -215,12 +204,21 @@ def filter_loss(out, data_samples, metadata):
    return jnp.vdot(e, e) / (e.size)
```
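
Note that `jnp.vdot` conjugates its first argument, so for a complex error signal this loss is the mean squared magnitude. A quick standalone illustration:

```{python}
import jax.numpy as jnp

e = jnp.array([1 + 1j, 2 - 2j])
# vdot conjugates its first argument: sum(|e|^2) = 2 + 8 = 10, mean = 5
print(jnp.vdot(e, e) / e.size)  # (5+0j)
```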

-We can construct the meta-loss in a similar fashion.
+We can construct the meta-train and meta-val losses in a similar fashion.

```{python}
-def meta_loss(losses, outputs, data_samples, metadata, outer_learnable):
-    EPS = 1e-9
-    return jnp.log(jnp.mean(jnp.abs(outputs - data_samples["d"]) ** 2) + EPS)
+def meta_train_loss(losses, outputs, data_samples, metadata, outer_learnable):
+    out = jnp.concatenate(outputs["out"], 0)
+    return jnp.log(jnp.mean(jnp.abs(out - data_samples["d"]) ** 2) + 1e-9)
+
+def meta_val_loss(losses, outputs, data_samples, metadata, outer_learnable):
+    out = jnp.reshape(
+        outputs["out"],
+        (outputs["out"].shape[0], -1, outputs["out"].shape[-1]),
+    )
+    d = data_samples["d"]
+    min_len = min(out.shape[1], d.shape[1])
+    return jnp.log(jnp.mean(jnp.abs(out[:, :min_len] - d[:, :min_len]) ** 2) + 1e-9)
```
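
As a quick sanity check of the meta-train loss (shapes are illustrative only, and `meta_train_loss` is assumed in scope from the snippet above), zero outputs against unit targets give log(1 + 1e-9), which is approximately 0:

```{python}
import jax.numpy as jnp

outputs = {"out": [jnp.zeros((4, 64, 1))]}
data_samples = {"d": jnp.ones((4, 64, 1))}
# mean |0 - 1|^2 = 1, so the loss is log(1 + 1e-9) ~= 0
print(meta_train_loss(None, outputs, data_samples, None, None))
```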

With everything defined, we can set up the Meta-Trainer and start training.
@@ -239,8 +237,8 @@ system = MetaAFTrainer(
    _filter_fwd=_SystemID_fwd,
    filter_kwargs=SystemID.grab_args(kwargs),
    filter_loss=filter_loss,
-    meta_train_loss=meta_loss,
-    meta_val_loss=meta_loss,
+    meta_train_loss=meta_train_loss,
+    meta_val_loss=meta_val_loss,
    optimizer_kwargs=ElementWiseGRU.grab_args(kwargs),
    train_loader=train_loader,
    val_loader=val_loader,
@@ -258,21 +256,30 @@ That is it! For more advanced options check out the zoo, where we demonstrate ca

## Meta-AF Zoo

-The [Meta-AF Zoo](zoo/README.md) contains implementations for system identification, acoustic echo cancellation, equalization, weighted predection error dereverberation, and a generalized sidelobe canceller beamformer all in the `metaaf` framework. You can find intructions for how to run, evaluate, and setup those models [here](zoo/README.md). For trained weights, please see the tagged release zip file [here](https://github.com/adobe-research/MetaAF/releases/tag/v0.1.0).
+The [Meta-AF Zoo](zoo/README.md) contains implementations of system identification, acoustic echo cancellation, equalization, weighted prediction error dereverberation, and a generalized sidelobe canceller beamformer, all in the `metaaf` framework. You can find instructions for how to run, evaluate, and set up those models [here](zoo/README.md). For trained weights and tuned baselines, please see the tagged release zip file [here](https://github.com/adobe-research/MetaAF/releases/tag/v0.1.0).

## License

All core utility code within the `metaaf` folder is licensed via the [University of Illinois Open Source License](metaaf/LICENSE). All code within the `zoo` folder and model weights are licensed via the [Adobe Research License](zoo/LICENSE). Copyright (c) Adobe Systems Incorporated. All rights reserved.

+## Related Works
+
+An extension of this work, built with `metaaf`, is described [here](zoo/hometa_aec/README.md):
+
-## Related Works
+"[Meta-Learning for Adaptive Filters with Higher-Order Frequency Dependencies](https://arxiv.org/abs/2209.09955)", [Junkai Wu](https://www.linkedin.com/in/junkai-wu-19015b198/), [Jonah Casebeer](https://jmcasebeer.github.io), [Nicholas J. Bryan](https://ccrma.stanford.edu/~njb/), and [Paris Smaragdis](https://paris.cs.illinois.edu/), IWAENC, 2022.
+
+```BibTex
+@article{wu2022metalearning,
+  title={Meta-Learning for Adaptive Filters with Higher-Order Frequency Dependencies},
+  author={Wu, Junkai and Casebeer, Jonah and Bryan, Nicholas J. and Smaragdis, Paris},
+  journal={arXiv preprint arXiv:2209.09955},
+  year={2022}
+}
+```

-Please also see an early version of this work:
+An early version of this work appeared as:

-"[Auto-DSP: Learning to Optimize Acoustic Echo Cancellers](https://arxiv.org/abs/2110.04284)", [Jonah Casebeer](https://jmcasebeer.github.io), [Nicholas J. Bryan](https://ccrma.stanford.edu/~njb/), and [Paris Smaragdis](https://paris.cs.illinois.edu/), arXiv, 2022.
+"[Auto-DSP: Learning to Optimize Acoustic Echo Cancellers](https://arxiv.org/abs/2110.04284)", [Jonah Casebeer](https://jmcasebeer.github.io), [Nicholas J. Bryan](https://ccrma.stanford.edu/~njb/), and [Paris Smaragdis](https://paris.cs.illinois.edu/), WASPAA, 2021.

```BibTex
@inproceedings{casebeer2021auto,
