I am using bigrquery to retrieve a table with 23 columns and over 3.3 million rows that includes all types of data (character, numeric, date).

`api = "json"` takes a while to run but works fine:
```r
> system.time(bq_results_json <- bq_table_download(bq_submit, bigint = "integer64", api = "json"))
Downloading first chunk of data.
Received 28,209 rows in the first chunk.
Downloading the remaining 3,340,910 rows in 158 chunks of (up to) 21,156 rows.
   user  system elapsed
 53.490   7.913  66.507
```
`api = "arrow"` finishes faster:
```r
> system.time(bq_results_arrw <- bq_table_download(bq_submit, bigint = "integer64", api = "arrow"))
Streamed 3369119 rows in 2646 messages.
   user  system elapsed
  9.575   4.333  34.643
```
but it then renders the R console unusable for several minutes. In my last attempt it stayed frozen like this for nearly 10 minutes:

Any hints on what might be going wrong?