Commit 2ccc2f0

Normalize whitespace in R cookbooks
1 parent 59c98c2 commit 2ccc2f0

2 files changed: 9 additions & 10 deletions


r/content/specify_data_types_and_schemas.Rmd

Lines changed: 8 additions & 9 deletions
@@ -21,19 +21,19 @@

 ## Introduction

-As discussed in previous chapters, Arrow automatically infers the most
-appropriate data type when reading in data or converting R objects to Arrow
-objects. However, you might want to manually tell Arrow which data types to
-use, for example, to ensure interoperability with databases and data warehouse
+As discussed in previous chapters, Arrow automatically infers the most
+appropriate data type when reading in data or converting R objects to Arrow
+objects. However, you might want to manually tell Arrow which data types to
+use, for example, to ensure interoperability with databases and data warehouse
 systems. This chapter includes recipes for:

 * changing the data types of existing Arrow objects
 * defining data types during the process of creating Arrow objects

-A table showing the default mappings between R and Arrow data types can be found
+A table showing the default mappings between R and Arrow data types can be found
 in [R data type to Arrow data type mappings](https://arrow.apache.org/docs/r/articles/arrow.html#r-to-arrow).

-A table containing Arrow data types, and their R equivalents can be found in
+A table containing Arrow data types, and their R equivalents can be found in
 [Arrow data type to R data type mapping](https://arrow.apache.org/docs/r/articles/arrow.html#arrow-to-r).

 ## Update data type of an existing Arrow Array
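For reference, the recipe this hunk touches changes an existing Array's type after creation via its cast method. A minimal sketch of that pattern with the arrow package (the values and the int64 target here are illustrative choices, not taken from the cookbook):

```r
library(arrow)

# Arrow infers int32 for R integer vectors; cast() requests a different type.
int_array <- Array$create(c(1L, 2L, 3L))
big_array <- int_array$cast(int64())
big_array$type  # int64
```

ChunkedArray objects support the same `$cast()` method, so the pattern carries over to columns of a Table.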
@@ -63,7 +63,7 @@ test_that("cast_array works as expected", {

 ### Discussion

-There are some data types which are not compatible with each other. Errors will
+There are some data types which are not compatible with each other. Errors will
 occur if you try to cast between incompatible data types.

 ```{r, incompat, eval = FALSE}
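The `incompat` chunk that opens at the end of this hunk demonstrates such a failing cast. As a rough sketch of the same idea (my own example, not the chunk's contents), casting non-numeric strings to an integer type errors:

```r
library(arrow)

# Strings that do not parse as integers cannot be cast to int32.
str_array <- Array$create(c("a", "b", "c"))
try(str_array$cast(int32()))  # fails with a conversion error
```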
@@ -122,7 +122,7 @@ test_that("cast_table works as expected", {

 ## Specify data types when creating an Arrow table from an R object

-You want to manually specify Arrow data types when converting an object from a
+You want to manually specify Arrow data types when converting an object from a
 data frame to an Arrow object.

 ### Solution
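The solution that follows this hunk passes a schema at creation time rather than casting afterwards. A minimal sketch of the pattern (the data frame, column names, and types are invented for illustration):

```r
library(arrow)

df <- data.frame(x = 1:3, y = c("a", "b", "c"))

# Supply an explicit schema instead of letting Arrow infer the types.
tbl <- arrow_table(df, schema = schema(x = int64(), y = utf8()))
tbl$schema
```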
@@ -187,4 +187,3 @@ test_that("use_schema_dataset works as expected", {
 ```{r, include=FALSE}
 unlink("oscars_data", recursive = TRUE)
 ```
-
r/content/tables.Rmd

Lines changed: 1 addition & 1 deletion
@@ -413,4 +413,4 @@ You can perform these window aggregate operations on Arrow tables by:
 - Computing the aggregation separately, and joining the result
 - Passing the data to DuckDB, and using the DuckDB query engine to perform the operations

-Arrow supports zero-copy integration with DuckDB: DuckDB can query Arrow datasets directly and stream query results back to Arrow. This integration streams data zero-copy between DuckDB and Arrow in both directions, so you can compose a query using the two together without paying any cost to (re)serialize the data as it passes back and forth. This is especially useful when something is supported in one of the Arrow or DuckDB query engines but not the other. You can find more information about this integration in the [Arrow blog post](https://arrow.apache.org/blog/2021/12/03/arrow-duckdb/).
+Arrow supports zero-copy integration with DuckDB: DuckDB can query Arrow datasets directly and stream query results back to Arrow. This integration streams data zero-copy between DuckDB and Arrow in both directions, so you can compose a query using the two together without paying any cost to (re)serialize the data as it passes back and forth. This is especially useful when something is supported in one of the Arrow or DuckDB query engines but not the other. You can find more information about this integration in the [Arrow blog post](https://arrow.apache.org/blog/2021/12/03/arrow-duckdb/).
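As a concrete illustration of that hand-off, here is a minimal sketch using the arrow, duckdb, and dplyr packages (the dataset and the windowed aggregate are my own illustrative choices, not a chunk from tables.Rmd):

```r
library(arrow)
library(duckdb)
library(dplyr)

# Hand an Arrow table to DuckDB, compute a windowed aggregate there,
# and stream the result back to Arrow without reserializing the data.
result <- arrow_table(mtcars) %>%
  to_duckdb() %>%                 # zero-copy hand-off to the DuckDB engine
  group_by(cyl) %>%
  mutate(mean_hp = mean(hp)) %>%  # window aggregate, delegated to DuckDB
  ungroup() %>%
  to_arrow()                      # query results stream back as Arrow data
```

The grouped `mutate()` is the kind of window operation that Arrow's own query engine does not execute, which is exactly the case where delegating to DuckDB pays off.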
