Commit 9a3ff71 ("prudence")

1 parent: 2c608e3

1 file changed, 4 additions and 4 deletions

vignettes/duckplyr.Rmd (4 additions, 4 deletions)
@@ -42,9 +42,9 @@ If the duckplyr data.frame is accessed by...
 Therefore, duckplyr can be **both lazy** (within itself) **and not lazy** (for the outside world).
 
 Now, the default materialization can be problematic if dealing with large data: what if the materialization eats up all RAM?
-Therefore, the duckplyr package has a **safeguard called funneling**.
-A funneled data.frame cannot be materialized by default, it needs a call to a `compute()` function.
-By default, duckplyr frames are _unfunneled_, but duckplyr frames created from Parquet data (presumedly large) are _funneled_.
+Therefore, the duckplyr package has a **safeguard called prudence**.
+A prudent data.frame cannot be materialized by default, it needs a call to a `compute()` function.
+By default, duckplyr frames are _lavish_, but duckplyr frames created from Parquet data (presumedly large) are _frugal_.
 
 ## How to use duckplyr
 
@@ -74,7 +74,7 @@ With large datasets, you want:
 - input data in an efficient format, like Parquet files. Therefore you might input data using `read_parquet_duckdb()`.
 - efficient computation, which duckplyr provides via DuckDB's holistic optimization, without your having to use another syntax than dplyr.
 - the output to not clutter all the memory. Therefore you can make use of these features:
-  - funneling (see `vignette("funnel")`) to disable automatic materialization completely or to disable automatic materialization up to a certain output size.
+  - prudence (see `vignette("funnel")`) to disable automatic materialization completely or to disable automatic materialization up to a certain output size.
   - computation to files using `compute_parquet()` or `compute_csv()`.
 

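The renamed safeguard can be sketched in a few lines of R. This is a hedged example, not taken from the commit itself: it assumes the duckplyr ≥ 1.0.0 API in which `duckdb_tibble()` accepts a `.prudence` argument with values such as `"lavish"` and `"frugal"`; the exact argument name and accepted values may differ in your installed version.

```r
library(duckplyr)  # assumed: a duckplyr version where "prudence" replaced "funneling"
library(dplyr)

# A lavish frame materializes automatically when accessed as a data.frame.
lavish_df <- duckdb_tibble(x = 1:3, .prudence = "lavish")
nrow(lavish_df)  # fine: triggers automatic materialization

# A frugal frame refuses automatic materialization;
# it needs an explicit call to a compute() function.
frugal_df <- duckdb_tibble(x = 1:3, .prudence = "frugal")

out <- frugal_df |>
  mutate(y = x * 2) |>
  compute()  # explicit materialization inside DuckDB

tib <- collect(out)  # bring the result into R memory as a tibble
print(tib)
```

For results too large for memory, `compute_parquet()` or `compute_csv()` can be used in place of `compute()` to write the output to a file instead.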
0 commit comments