Summary of entries of Methods for bigquery.
google.cloud.bigquery.dbapi.Binary
Binary(data)Construct a DB-API binary value.
See more: google.cloud.bigquery.dbapi.Binary
google.cloud.bigquery.dbapi.DateFromTicks
DateFromTicks(timestamp, /)Create a date from a POSIX timestamp.
google.cloud.bigquery.dbapi.TimeFromTicks
TimeFromTicks(ticks, tz=None)Construct a DB-API time value from the given ticks value.
google.cloud.bigquery.dbapi.TimestampFromTicks
TimestampFromTicks(timestamp, tz=None)Construct a DB-API timestamp value (tz's local time) from a POSIX timestamp.
google.cloud.bigquery.dbapi.connect
connect(client=None, bqstorage_client=None, prefer_bqstorage_client=True)Construct a DB-API connection to Google BigQuery.
See more: google.cloud.bigquery.dbapi.connect
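A minimal DB-API usage sketch (the project ID is a placeholder, not part of this reference):

    from google.cloud import bigquery
    from google.cloud.bigquery import dbapi

    client = bigquery.Client(project="my-project")  # placeholder project ID
    connection = dbapi.connect(client)
    cursor = connection.cursor()
    cursor.execute("SELECT 1 AS x")
    print(cursor.fetchall())
    connection.close()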
google.cloud.bigquery.client.Client.cancel_job
cancel_job(
job_id: str,
project: typing.Optional[str] = None,
location: typing.Optional[str] = None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> typing.Union[
google.cloud.bigquery.job.load.LoadJob,
google.cloud.bigquery.job.copy_.CopyJob,
google.cloud.bigquery.job.extract.ExtractJob,
google.cloud.bigquery.job.query.QueryJob,
]Attempt to cancel a job from a job ID.
google.cloud.bigquery.client.Client.close
close()Close the underlying transport objects, releasing system resources.
google.cloud.bigquery.client.Client.copy_table
copy_table(
sources: typing.Union[
google.cloud.bigquery.table.Table,
google.cloud.bigquery.table.TableReference,
google.cloud.bigquery.table.TableListItem,
str,
typing.Sequence[
typing.Union[
google.cloud.bigquery.table.Table,
google.cloud.bigquery.table.TableReference,
google.cloud.bigquery.table.TableListItem,
str,
]
],
],
destination: typing.Union[
google.cloud.bigquery.table.Table,
google.cloud.bigquery.table.TableReference,
google.cloud.bigquery.table.TableListItem,
str,
],
job_id: typing.Optional[str] = None,
job_id_prefix: typing.Optional[str] = None,
location: typing.Optional[str] = None,
project: typing.Optional[str] = None,
job_config: typing.Optional[google.cloud.bigquery.job.copy_.CopyJobConfig] = None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> google.cloud.bigquery.job.copy_.CopyJobCopy one or more tables to another table.
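For illustration, a hedged sketch of copying one table to another (table IDs are hypothetical):

    from google.cloud import bigquery

    client = bigquery.Client()
    copy_job = client.copy_table(
        "my-project.my_dataset.source_table",        # hypothetical source
        "my-project.my_dataset.destination_table",   # hypothetical destination
    )
    copy_job.result()  # block until the copy job finishes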
google.cloud.bigquery.client.Client.create_dataset
create_dataset(
dataset: typing.Union[
str,
google.cloud.bigquery.dataset.Dataset,
google.cloud.bigquery.dataset.DatasetReference,
google.cloud.bigquery.dataset.DatasetListItem,
],
exists_ok: bool = False,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> google.cloud.bigquery.dataset.DatasetAPI call: create the dataset via a POST request.
See more: google.cloud.bigquery.client.Client.create_dataset
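A minimal sketch, assuming a placeholder dataset ID:

    from google.cloud import bigquery

    client = bigquery.Client()
    dataset = bigquery.Dataset("my-project.my_new_dataset")  # placeholder ID
    dataset.location = "US"
    dataset = client.create_dataset(dataset, exists_ok=True, timeout=30)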
google.cloud.bigquery.client.Client.create_job
create_job(
job_config: dict,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> typing.Union[
google.cloud.bigquery.job.load.LoadJob,
google.cloud.bigquery.job.copy_.CopyJob,
google.cloud.bigquery.job.extract.ExtractJob,
google.cloud.bigquery.job.query.QueryJob,
]Create a new job.
google.cloud.bigquery.client.Client.create_routine
create_routine(
routine: google.cloud.bigquery.routine.routine.Routine,
exists_ok: bool = False,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> google.cloud.bigquery.routine.routine.Routine[Beta] Create a routine via a POST request.
See more: google.cloud.bigquery.client.Client.create_routine
google.cloud.bigquery.client.Client.create_table
create_table(
table: typing.Union[
str,
google.cloud.bigquery.table.Table,
google.cloud.bigquery.table.TableReference,
google.cloud.bigquery.table.TableListItem,
],
exists_ok: bool = False,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> google.cloud.bigquery.table.TableAPI call: create a table via a PUT request.
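For example (the schema and table ID are illustrative only):

    from google.cloud import bigquery

    client = bigquery.Client()
    schema = [
        bigquery.SchemaField("full_name", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("age", "INTEGER"),
    ]
    table = bigquery.Table("my-project.my_dataset.people", schema=schema)
    table = client.create_table(table, exists_ok=True)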
google.cloud.bigquery.client.Client.dataset
dataset(
dataset_id: str, project: typing.Optional[str] = None
) -> google.cloud.bigquery.dataset.DatasetReferenceDeprecated: Construct a reference to a dataset.
google.cloud.bigquery.client.Client.delete_dataset
delete_dataset(
dataset: typing.Union[
google.cloud.bigquery.dataset.Dataset,
google.cloud.bigquery.dataset.DatasetReference,
google.cloud.bigquery.dataset.DatasetListItem,
str,
],
delete_contents: bool = False,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
not_found_ok: bool = False,
) -> NoneDelete a dataset.
See more: google.cloud.bigquery.client.Client.delete_dataset
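A short sketch (the dataset ID is a placeholder); delete_contents removes any tables first, and not_found_ok suppresses the error if the dataset is already gone:

    from google.cloud import bigquery

    client = bigquery.Client()
    client.delete_dataset(
        "my-project.my_dataset",  # placeholder dataset ID
        delete_contents=True,
        not_found_ok=True,
    )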
google.cloud.bigquery.client.Client.delete_job_metadata
delete_job_metadata(
job_id: typing.Union[
str,
google.cloud.bigquery.job.load.LoadJob,
google.cloud.bigquery.job.copy_.CopyJob,
google.cloud.bigquery.job.extract.ExtractJob,
google.cloud.bigquery.job.query.QueryJob,
],
project: typing.Optional[str] = None,
location: typing.Optional[str] = None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
not_found_ok: bool = False,
)[Beta] Delete job metadata from job history.
See more: google.cloud.bigquery.client.Client.delete_job_metadata
google.cloud.bigquery.client.Client.delete_model
delete_model(
model: typing.Union[
google.cloud.bigquery.model.Model,
google.cloud.bigquery.model.ModelReference,
str,
],
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
not_found_ok: bool = False,
) -> None[Beta] Delete a model.
google.cloud.bigquery.client.Client.delete_routine
delete_routine(
routine: typing.Union[
google.cloud.bigquery.routine.routine.Routine,
google.cloud.bigquery.routine.routine.RoutineReference,
str,
],
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
not_found_ok: bool = False,
) -> None[Beta] Delete a routine.
See more: google.cloud.bigquery.client.Client.delete_routine
google.cloud.bigquery.client.Client.delete_table
delete_table(
table: typing.Union[
google.cloud.bigquery.table.Table,
google.cloud.bigquery.table.TableReference,
google.cloud.bigquery.table.TableListItem,
str,
],
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
not_found_ok: bool = False,
) -> NoneDelete a table.
google.cloud.bigquery.client.Client.extract_table
extract_table(
source: typing.Union[
google.cloud.bigquery.table.Table,
google.cloud.bigquery.table.TableReference,
google.cloud.bigquery.table.TableListItem,
google.cloud.bigquery.model.Model,
google.cloud.bigquery.model.ModelReference,
str,
],
destination_uris: typing.Union[str, typing.Sequence[str]],
job_id: typing.Optional[str] = None,
job_id_prefix: typing.Optional[str] = None,
location: typing.Optional[str] = None,
project: typing.Optional[str] = None,
job_config: typing.Optional[
google.cloud.bigquery.job.extract.ExtractJobConfig
] = None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
source_type: str = "Table",
) -> google.cloud.bigquery.job.extract.ExtractJobStart a job to extract a table into Cloud Storage files.
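A hedged sketch of exporting a table to Cloud Storage (bucket and table IDs are hypothetical):

    from google.cloud import bigquery

    client = bigquery.Client()
    extract_job = client.extract_table(
        "my-project.my_dataset.my_table",          # hypothetical source table
        "gs://my-bucket/exports/my_table-*.csv",   # hypothetical destination URI
        location="US",
    )
    extract_job.result()  # wait for the extract job to complete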
google.cloud.bigquery.client.Client.get_dataset
get_dataset(
dataset_ref: typing.Union[google.cloud.bigquery.dataset.DatasetReference, str],
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
dataset_view: typing.Optional[google.cloud.bigquery.enums.DatasetView] = None,
) -> google.cloud.bigquery.dataset.DatasetFetch the dataset referenced by dataset_ref.
google.cloud.bigquery.client.Client.get_iam_policy
get_iam_policy(
table: typing.Union[
google.cloud.bigquery.table.Table,
google.cloud.bigquery.table.TableReference,
google.cloud.bigquery.table.TableListItem,
str,
],
requested_policy_version: int = 1,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> google.api_core.iam.PolicyReturn the access control policy for a table resource.
See more: google.cloud.bigquery.client.Client.get_iam_policy
google.cloud.bigquery.client.Client.get_job
get_job(
job_id: typing.Union[
str,
google.cloud.bigquery.job.load.LoadJob,
google.cloud.bigquery.job.copy_.CopyJob,
google.cloud.bigquery.job.extract.ExtractJob,
google.cloud.bigquery.job.query.QueryJob,
],
project: typing.Optional[str] = None,
location: typing.Optional[str] = None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = 128,
) -> typing.Union[
google.cloud.bigquery.job.load.LoadJob,
google.cloud.bigquery.job.copy_.CopyJob,
google.cloud.bigquery.job.extract.ExtractJob,
google.cloud.bigquery.job.query.QueryJob,
google.cloud.bigquery.job.base.UnknownJob,
]Fetch a job for the project associated with this client.
google.cloud.bigquery.client.Client.get_model
get_model(
model_ref: typing.Union[google.cloud.bigquery.model.ModelReference, str],
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> google.cloud.bigquery.model.Model[Beta] Fetch the model referenced by model_ref.
google.cloud.bigquery.client.Client.get_routine
get_routine(
routine_ref: typing.Union[
google.cloud.bigquery.routine.routine.Routine,
google.cloud.bigquery.routine.routine.RoutineReference,
str,
],
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> google.cloud.bigquery.routine.routine.Routine[Beta] Get the routine referenced by routine_ref.
google.cloud.bigquery.client.Client.get_service_account_email
get_service_account_email(
project: typing.Optional[str] = None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> strGet the email address of the project's BigQuery service account.
See more: google.cloud.bigquery.client.Client.get_service_account_email
google.cloud.bigquery.client.Client.get_table
get_table(
table: typing.Union[
google.cloud.bigquery.table.Table,
google.cloud.bigquery.table.TableReference,
google.cloud.bigquery.table.TableListItem,
str,
],
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> google.cloud.bigquery.table.TableFetch the table referenced by table.
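For example (the table ID is illustrative):

    from google.cloud import bigquery

    client = bigquery.Client()
    table = client.get_table("my-project.my_dataset.my_table")  # illustrative ID
    print(table.num_rows, [field.name for field in table.schema])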
google.cloud.bigquery.client.Client.insert_rows
insert_rows(
table: typing.Union[
google.cloud.bigquery.table.Table,
google.cloud.bigquery.table.TableReference,
str,
],
rows: typing.Union[
typing.Iterable[typing.Tuple], typing.Iterable[typing.Mapping[str, typing.Any]]
],
selected_fields: typing.Optional[
typing.Sequence[google.cloud.bigquery.schema.SchemaField]
] = None,
**kwargs
) -> typing.Sequence[typing.Dict[str, typing.Any]]Insert rows into a table via the streaming API.
google.cloud.bigquery.client.Client.insert_rows_from_dataframe
insert_rows_from_dataframe(
table: typing.Union[
google.cloud.bigquery.table.Table,
google.cloud.bigquery.table.TableReference,
str,
],
dataframe,
selected_fields: typing.Optional[
typing.Sequence[google.cloud.bigquery.schema.SchemaField]
] = None,
chunk_size: int = 500,
**kwargs: typing.Dict
) -> typing.Sequence[typing.Sequence[dict]]Insert rows into a table from a dataframe via the streaming API.
See more: google.cloud.bigquery.client.Client.insert_rows_from_dataframe
google.cloud.bigquery.client.Client.insert_rows_json
insert_rows_json(
table: typing.Union[
google.cloud.bigquery.table.Table,
google.cloud.bigquery.table.TableReference,
google.cloud.bigquery.table.TableListItem,
str,
],
json_rows: typing.Sequence[typing.Mapping[str, typing.Any]],
row_ids: typing.Optional[
typing.Union[
typing.Iterable[typing.Optional[str]],
google.cloud.bigquery.enums.AutoRowIDs,
]
] = AutoRowIDs.GENERATE_UUID,
skip_invalid_rows: typing.Optional[bool] = None,
ignore_unknown_values: typing.Optional[bool] = None,
template_suffix: typing.Optional[str] = None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> typing.Sequence[dict]Insert rows into a table without applying local type conversions.
See more: google.cloud.bigquery.client.Client.insert_rows_json
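A minimal streaming-insert sketch (table ID and row contents are placeholders); an empty return list means all rows were accepted:

    from google.cloud import bigquery

    client = bigquery.Client()
    errors = client.insert_rows_json(
        "my-project.my_dataset.people",              # placeholder table ID
        [{"full_name": "Ada Lovelace", "age": 36}],  # rows as JSON-compatible dicts
    )
    if errors:
        print("Encountered errors:", errors)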
google.cloud.bigquery.client.Client.job_from_resource
job_from_resource(
resource: dict,
) -> typing.Union[
google.cloud.bigquery.job.copy_.CopyJob,
google.cloud.bigquery.job.extract.ExtractJob,
google.cloud.bigquery.job.load.LoadJob,
google.cloud.bigquery.job.query.QueryJob,
google.cloud.bigquery.job.base.UnknownJob,
]Detect correct job type from resource and instantiate.
See more: google.cloud.bigquery.client.Client.job_from_resource
google.cloud.bigquery.client.Client.list_datasets
list_datasets(
project: typing.Optional[str] = None,
include_all: bool = False,
filter: typing.Optional[str] = None,
max_results: typing.Optional[int] = None,
page_token: typing.Optional[str] = None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
page_size: typing.Optional[int] = None,
) -> google.api_core.page_iterator.IteratorList datasets for the project associated with this client.
google.cloud.bigquery.client.Client.list_jobs
list_jobs(
project: typing.Optional[str] = None,
parent_job: typing.Optional[
typing.Union[google.cloud.bigquery.job.query.QueryJob, str]
] = None,
max_results: typing.Optional[int] = None,
page_token: typing.Optional[str] = None,
all_users: typing.Optional[bool] = None,
state_filter: typing.Optional[str] = None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
min_creation_time: typing.Optional[datetime.datetime] = None,
max_creation_time: typing.Optional[datetime.datetime] = None,
page_size: typing.Optional[int] = None,
) -> google.api_core.page_iterator.IteratorList jobs for the project associated with this client.
google.cloud.bigquery.client.Client.list_models
list_models(
dataset: typing.Union[
google.cloud.bigquery.dataset.Dataset,
google.cloud.bigquery.dataset.DatasetReference,
google.cloud.bigquery.dataset.DatasetListItem,
str,
],
max_results: typing.Optional[int] = None,
page_token: typing.Optional[str] = None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
page_size: typing.Optional[int] = None,
) -> google.api_core.page_iterator.Iterator[Beta] List models in the dataset.
google.cloud.bigquery.client.Client.list_partitions
list_partitions(
table: typing.Union[
google.cloud.bigquery.table.Table,
google.cloud.bigquery.table.TableReference,
google.cloud.bigquery.table.TableListItem,
str,
],
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> typing.Sequence[str]List the partitions in a table.
See more: google.cloud.bigquery.client.Client.list_partitions
google.cloud.bigquery.client.Client.list_projects
list_projects(
max_results: typing.Optional[int] = None,
page_token: typing.Optional[str] = None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
page_size: typing.Optional[int] = None,
) -> google.api_core.page_iterator.IteratorList projects visible to the account associated with this client.
google.cloud.bigquery.client.Client.list_routines
list_routines(
dataset: typing.Union[
google.cloud.bigquery.dataset.Dataset,
google.cloud.bigquery.dataset.DatasetReference,
google.cloud.bigquery.dataset.DatasetListItem,
str,
],
max_results: typing.Optional[int] = None,
page_token: typing.Optional[str] = None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
page_size: typing.Optional[int] = None,
) -> google.api_core.page_iterator.Iterator[Beta] List routines in the dataset.
google.cloud.bigquery.client.Client.list_rows
list_rows(
table: typing.Union[
google.cloud.bigquery.table.Table,
google.cloud.bigquery.table.TableListItem,
google.cloud.bigquery.table.TableReference,
str,
],
selected_fields: typing.Optional[
typing.Sequence[google.cloud.bigquery.schema.SchemaField]
] = None,
max_results: typing.Optional[int] = None,
page_token: typing.Optional[str] = None,
start_index: typing.Optional[int] = None,
page_size: typing.Optional[int] = None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> google.cloud.bigquery.table.RowIteratorList the rows of the table.
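For example (the table ID is a placeholder):

    from google.cloud import bigquery

    client = bigquery.Client()
    rows = client.list_rows("my-project.my_dataset.people", max_results=10)
    for row in rows:
        print(dict(row))  # each Row supports mapping-style access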
google.cloud.bigquery.client.Client.list_tables
list_tables(
dataset: typing.Union[
google.cloud.bigquery.dataset.Dataset,
google.cloud.bigquery.dataset.DatasetReference,
google.cloud.bigquery.dataset.DatasetListItem,
str,
],
max_results: typing.Optional[int] = None,
page_token: typing.Optional[str] = None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
page_size: typing.Optional[int] = None,
) -> google.api_core.page_iterator.IteratorList tables in the dataset.
google.cloud.bigquery.client.Client.load_table_from_dataframe
load_table_from_dataframe(
dataframe: pandas.DataFrame,
destination: Union[Table, TableReference, str],
num_retries: int = 6,
job_id: Optional[str] = None,
job_id_prefix: Optional[str] = None,
location: Optional[str] = None,
project: Optional[str] = None,
job_config: Optional[LoadJobConfig] = None,
parquet_compression: str = "snappy",
timeout: ResumableTimeoutType = None,
) -> job.LoadJobUpload the contents of a table from a pandas DataFrame.
See more: google.cloud.bigquery.client.Client.load_table_from_dataframe
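A hedged sketch, assuming pandas (and pyarrow for Parquet serialization) are installed and the destination table ID is a placeholder:

    import pandas as pd
    from google.cloud import bigquery

    client = bigquery.Client()
    dataframe = pd.DataFrame({"full_name": ["Ada Lovelace"], "age": [36]})
    load_job = client.load_table_from_dataframe(
        dataframe, "my-project.my_dataset.people"  # placeholder destination
    )
    load_job.result()  # wait for the load job to finish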
google.cloud.bigquery.client.Client.load_table_from_file
load_table_from_file(
file_obj: typing.IO[bytes],
destination: typing.Union[
google.cloud.bigquery.table.Table,
google.cloud.bigquery.table.TableReference,
google.cloud.bigquery.table.TableListItem,
str,
],
rewind: bool = False,
size: typing.Optional[int] = None,
num_retries: int = 6,
job_id: typing.Optional[str] = None,
job_id_prefix: typing.Optional[str] = None,
location: typing.Optional[str] = None,
project: typing.Optional[str] = None,
job_config: typing.Optional[google.cloud.bigquery.job.load.LoadJobConfig] = None,
timeout: typing.Union[None, float, typing.Tuple[float, float]] = None,
) -> google.cloud.bigquery.job.load.LoadJobUpload the contents of this table from a file-like object.
See more: google.cloud.bigquery.client.Client.load_table_from_file
google.cloud.bigquery.client.Client.load_table_from_json
load_table_from_json(
json_rows: typing.Iterable[typing.Dict[str, typing.Any]],
destination: typing.Union[
google.cloud.bigquery.table.Table,
google.cloud.bigquery.table.TableReference,
google.cloud.bigquery.table.TableListItem,
str,
],
num_retries: int = 6,
job_id: typing.Optional[str] = None,
job_id_prefix: typing.Optional[str] = None,
location: typing.Optional[str] = None,
project: typing.Optional[str] = None,
job_config: typing.Optional[google.cloud.bigquery.job.load.LoadJobConfig] = None,
timeout: typing.Union[None, float, typing.Tuple[float, float]] = None,
) -> google.cloud.bigquery.job.load.LoadJobUpload the contents of a table from a JSON string or dict.
See more: google.cloud.bigquery.client.Client.load_table_from_json
google.cloud.bigquery.client.Client.load_table_from_uri
load_table_from_uri(
source_uris: typing.Union[str, typing.Sequence[str]],
destination: typing.Union[
google.cloud.bigquery.table.Table,
google.cloud.bigquery.table.TableReference,
google.cloud.bigquery.table.TableListItem,
str,
],
job_id: typing.Optional[str] = None,
job_id_prefix: typing.Optional[str] = None,
location: typing.Optional[str] = None,
project: typing.Optional[str] = None,
job_config: typing.Optional[google.cloud.bigquery.job.load.LoadJobConfig] = None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> google.cloud.bigquery.job.load.LoadJobStarts a job for loading data into a table from Cloud Storage.
See more: google.cloud.bigquery.client.Client.load_table_from_uri
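For illustration, loading a CSV file from Cloud Storage (the URI and table ID are hypothetical):

    from google.cloud import bigquery

    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
    )
    load_job = client.load_table_from_uri(
        "gs://my-bucket/data/people.csv",   # hypothetical source URI
        "my-project.my_dataset.people",     # hypothetical destination table
        job_config=job_config,
    )
    load_job.result()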
google.cloud.bigquery.client.Client.query
query(
query: str,
job_config: typing.Optional[google.cloud.bigquery.job.query.QueryJobConfig] = None,
job_id: typing.Optional[str] = None,
job_id_prefix: typing.Optional[str] = None,
location: typing.Optional[str] = None,
project: typing.Optional[str] = None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
job_retry: typing.Optional[
google.api_core.retry.retry_unary.Retry
] = google.api_core.retry.retry_unary.Retry,
api_method: typing.Union[
str, google.cloud.bigquery.enums.QueryApiMethod
] = QueryApiMethod.INSERT,
) -> google.cloud.bigquery.job.query.QueryJobRun a SQL query.
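A sketch of running a parameterized query against a public dataset (the parameter value is arbitrary):

    from google.cloud import bigquery

    client = bigquery.Client()
    job_config = bigquery.QueryJobConfig(
        query_parameters=[bigquery.ScalarQueryParameter("name", "STRING", "Ada")]
    )
    query_job = client.query(
        "SELECT name, SUM(number) AS total "
        "FROM `bigquery-public-data.usa_names.usa_1910_2013` "
        "WHERE name = @name GROUP BY name",
        job_config=job_config,
    )
    for row in query_job.result():
        print(row.name, row.total)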
google.cloud.bigquery.client.Client.query_and_wait
query_and_wait(
query,
*,
job_config: typing.Optional[google.cloud.bigquery.job.query.QueryJobConfig] = None,
location: typing.Optional[str] = None,
project: typing.Optional[str] = None,
api_timeout: typing.Optional[float] = None,
wait_timeout: typing.Union[float, None, object] = object,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
job_retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
page_size: typing.Optional[int] = None,
max_results: typing.Optional[int] = None
) -> google.cloud.bigquery.table.RowIteratorRun the query, wait for it to finish, and return the results.
See more: google.cloud.bigquery.client.Client.query_and_wait
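query_and_wait skips the intermediate QueryJob object when only the rows are needed; a minimal sketch:

    from google.cloud import bigquery

    client = bigquery.Client()
    rows = client.query_and_wait("SELECT 1 AS x")  # returns a RowIterator
    for row in rows:
        print(row.x)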
google.cloud.bigquery.client.Client.schema_from_json
schema_from_json(file_or_path: PathType) -> List[SchemaField]Takes a file object or file path that contains JSON describing a table schema.
See more: google.cloud.bigquery.client.Client.schema_from_json
google.cloud.bigquery.client.Client.schema_to_json
schema_to_json(schema_list: Sequence[SchemaField], destination: PathType)Takes a list of schema field objects and serializes them as JSON to the destination.
See more: google.cloud.bigquery.client.Client.schema_to_json
google.cloud.bigquery.client.Client.set_iam_policy
set_iam_policy(
table: typing.Union[
google.cloud.bigquery.table.Table,
google.cloud.bigquery.table.TableReference,
google.cloud.bigquery.table.TableListItem,
str,
],
policy: google.api_core.iam.Policy,
updateMask: typing.Optional[str] = None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
*,
fields: typing.Sequence[str] = ()
) -> google.api_core.iam.PolicySet the access control policy for a table resource and return the updated policy.
See more: google.cloud.bigquery.client.Client.set_iam_policy
google.cloud.bigquery.client.Client.update_dataset
update_dataset(
dataset: google.cloud.bigquery.dataset.Dataset,
fields: typing.Sequence[str],
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
update_mode: typing.Optional[google.cloud.bigquery.enums.UpdateMode] = None,
) -> google.cloud.bigquery.dataset.DatasetChange some fields of a dataset.
See more: google.cloud.bigquery.client.Client.update_dataset
google.cloud.bigquery.client.Client.update_model
update_model(
model: google.cloud.bigquery.model.Model,
fields: typing.Sequence[str],
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> google.cloud.bigquery.model.Model[Beta] Change some fields of a model.
google.cloud.bigquery.client.Client.update_routine
update_routine(
routine: google.cloud.bigquery.routine.routine.Routine,
fields: typing.Sequence[str],
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> google.cloud.bigquery.routine.routine.Routine[Beta] Change some fields of a routine.
See more: google.cloud.bigquery.client.Client.update_routine
google.cloud.bigquery.client.Client.update_table
update_table(
table: google.cloud.bigquery.table.Table,
fields: typing.Sequence[str],
autodetect_schema: bool = False,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> google.cloud.bigquery.table.TableChange some fields of a table.
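For example, updating a single field (the table ID is a placeholder); only the fields listed in the second argument are sent to the API:

    from google.cloud import bigquery

    client = bigquery.Client()
    table = client.get_table("my-project.my_dataset.people")  # placeholder ID
    table.description = "People dimension table"
    table = client.update_table(table, ["description"])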
google.cloud.bigquery.client.Project.from_api_repr
from_api_repr(resource)Factory: construct an instance from a resource dict.
See more: google.cloud.bigquery.client.Project.from_api_repr
google.cloud.bigquery.dataset.AccessEntry.from_api_repr
from_api_repr(resource: dict) -> google.cloud.bigquery.dataset.AccessEntryFactory: construct an access entry given its API representation.
See more: google.cloud.bigquery.dataset.AccessEntry.from_api_repr
google.cloud.bigquery.dataset.AccessEntry.to_api_repr
to_api_repr()Construct the API resource representation of this access entry.
See more: google.cloud.bigquery.dataset.AccessEntry.to_api_repr
google.cloud.bigquery.dataset.Condition.__eq__
__eq__(other: object) -> boolCheck for equality based on expression, title, and description.
google.cloud.bigquery.dataset.Condition.__hash__
__hash__() -> intGenerate a hash based on expression, title, and description.
google.cloud.bigquery.dataset.Condition.__ne__
__ne__(other: object) -> boolCheck for inequality.
google.cloud.bigquery.dataset.Condition.__repr__
__repr__() -> strReturn a string representation of the Condition object.
google.cloud.bigquery.dataset.Condition.from_api_repr
from_api_repr(
resource: typing.Dict[str, typing.Any],
) -> google.cloud.bigquery.dataset.ConditionFactory: construct a Condition instance given its API representation.
See more: google.cloud.bigquery.dataset.Condition.from_api_repr
google.cloud.bigquery.dataset.Condition.to_api_repr
to_api_repr() -> typing.Dict[str, typing.Any]Construct the API resource representation of this Condition.
See more: google.cloud.bigquery.dataset.Condition.to_api_repr
google.cloud.bigquery.dataset.Dataset.from_api_repr
from_api_repr(resource: dict) -> google.cloud.bigquery.dataset.DatasetFactory: construct a dataset given its API representation.
See more: google.cloud.bigquery.dataset.Dataset.from_api_repr
google.cloud.bigquery.dataset.Dataset.from_string
from_string(full_dataset_id: str) -> google.cloud.bigquery.dataset.DatasetConstruct a dataset from fully-qualified dataset ID.
google.cloud.bigquery.dataset.Dataset.model
model(model_id)Constructs a ModelReference.
google.cloud.bigquery.dataset.Dataset.routine
routine(routine_id)Constructs a RoutineReference.
google.cloud.bigquery.dataset.Dataset.table
table(table_id: str) -> google.cloud.bigquery.table.TableReferenceConstructs a TableReference.
google.cloud.bigquery.dataset.Dataset.to_api_repr
to_api_repr() -> dictConstruct the API resource representation of this dataset.
google.cloud.bigquery.dataset.DatasetListItem.model
model(model_id)Constructs a ModelReference.
See more: google.cloud.bigquery.dataset.DatasetListItem.model
google.cloud.bigquery.dataset.DatasetListItem.routine
routine(routine_id)Constructs a RoutineReference.
See more: google.cloud.bigquery.dataset.DatasetListItem.routine
google.cloud.bigquery.dataset.DatasetListItem.table
table(table_id: str) -> google.cloud.bigquery.table.TableReferenceConstructs a TableReference.
See more: google.cloud.bigquery.dataset.DatasetListItem.table
google.cloud.bigquery.dataset.DatasetReference.from_api_repr
from_api_repr(resource: dict) -> google.cloud.bigquery.dataset.DatasetReferenceFactory: construct a dataset reference given its API representation.
See more: google.cloud.bigquery.dataset.DatasetReference.from_api_repr
google.cloud.bigquery.dataset.DatasetReference.from_string
from_string(
dataset_id: str, default_project: typing.Optional[str] = None
) -> google.cloud.bigquery.dataset.DatasetReferenceConstruct a dataset reference from dataset ID string.
See more: google.cloud.bigquery.dataset.DatasetReference.from_string
google.cloud.bigquery.dataset.DatasetReference.model
model(model_id)Constructs a ModelReference.
See more: google.cloud.bigquery.dataset.DatasetReference.model
google.cloud.bigquery.dataset.DatasetReference.routine
routine(routine_id)Constructs a RoutineReference.
See more: google.cloud.bigquery.dataset.DatasetReference.routine
google.cloud.bigquery.dataset.DatasetReference.table
table(table_id: str) -> google.cloud.bigquery.table.TableReferenceConstructs a TableReference.
See more: google.cloud.bigquery.dataset.DatasetReference.table
google.cloud.bigquery.dataset.DatasetReference.to_api_repr
to_api_repr() -> dictConstruct the API resource representation of this dataset reference.
See more: google.cloud.bigquery.dataset.DatasetReference.to_api_repr
google.cloud.bigquery.dbapi.Connection.close
close()Close the connection and any cursors created from it.
google.cloud.bigquery.dbapi.Connection.commit
commit()No-op, but for consistency raise an error if connection is closed.
google.cloud.bigquery.dbapi.Connection.cursor
cursor()Return a new cursor object.
google.cloud.bigquery.dbapi.Cursor.close
close()Mark the cursor as closed, preventing its further use.
See more: google.cloud.bigquery.dbapi.Cursor.close
google.cloud.bigquery.dbapi.Cursor.execute
execute(operation, parameters=None, job_id=None, job_config=None)Prepare and execute a database operation.
google.cloud.bigquery.dbapi.Cursor.executemany
executemany(operation, seq_of_parameters)Prepare and execute a database operation multiple times.
google.cloud.bigquery.dbapi.Cursor.fetchall
fetchall()Fetch all remaining results from the last execute*() call.
google.cloud.bigquery.dbapi.Cursor.fetchmany
fetchmany(size=None)Fetch multiple results from the last execute*() call.
google.cloud.bigquery.dbapi.Cursor.fetchone
fetchone()Fetch a single row from the results of the last execute*() call.
google.cloud.bigquery.dbapi.Cursor.setinputsizes
setinputsizes(sizes)No-op, but for consistency raise an error if cursor is closed.
google.cloud.bigquery.dbapi.Cursor.setoutputsize
setoutputsize(size, column=None)No-op, but for consistency raise an error if cursor is closed.
google.cloud.bigquery.encryption_configuration.EncryptionConfiguration.from_api_repr
from_api_repr(resource)Construct an encryption configuration from its API representation.
See more: google.cloud.bigquery.encryption_configuration.EncryptionConfiguration.from_api_repr
google.cloud.bigquery.encryption_configuration.EncryptionConfiguration.to_api_repr
to_api_repr()Construct the API resource representation of this encryption configuration.
See more: google.cloud.bigquery.encryption_configuration.EncryptionConfiguration.to_api_repr
google.cloud.bigquery.external_config.BigtableColumn.from_api_repr
from_api_repr(
resource: dict,
) -> google.cloud.bigquery.external_config.BigtableColumnFactory: construct a BigtableColumn instance given its API representation.
See more: google.cloud.bigquery.external_config.BigtableColumn.from_api_repr
google.cloud.bigquery.external_config.BigtableColumn.to_api_repr
to_api_repr() -> dictBuild an API representation of this object.
See more: google.cloud.bigquery.external_config.BigtableColumn.to_api_repr
google.cloud.bigquery.external_config.BigtableColumnFamily.from_api_repr
from_api_repr(
resource: dict,
) -> google.cloud.bigquery.external_config.BigtableColumnFamilyFactory: construct a BigtableColumnFamily instance given its API representation.
See more: google.cloud.bigquery.external_config.BigtableColumnFamily.from_api_repr
google.cloud.bigquery.external_config.BigtableColumnFamily.to_api_repr
to_api_repr() -> dictBuild an API representation of this object.
See more: google.cloud.bigquery.external_config.BigtableColumnFamily.to_api_repr
google.cloud.bigquery.external_config.BigtableOptions.from_api_repr
from_api_repr(
resource: dict,
) -> google.cloud.bigquery.external_config.BigtableOptionsFactory: construct a BigtableOptions instance given its API representation.
See more: google.cloud.bigquery.external_config.BigtableOptions.from_api_repr
google.cloud.bigquery.external_config.BigtableOptions.to_api_repr
to_api_repr() -> dictBuild an API representation of this object.
See more: google.cloud.bigquery.external_config.BigtableOptions.to_api_repr
google.cloud.bigquery.external_config.CSVOptions.from_api_repr
from_api_repr(resource: dict) -> google.cloud.bigquery.external_config.CSVOptionsFactory: construct a CSVOptions instance given its API representation.
See more: google.cloud.bigquery.external_config.CSVOptions.from_api_repr
google.cloud.bigquery.external_config.CSVOptions.to_api_repr
to_api_repr() -> dictBuild an API representation of this object.
See more: google.cloud.bigquery.external_config.CSVOptions.to_api_repr
google.cloud.bigquery.external_config.ExternalCatalogDatasetOptions.from_api_repr
from_api_repr(
api_repr: dict,
) -> google.cloud.bigquery.external_config.ExternalCatalogDatasetOptionsFactory: constructs an instance of the class (cls) given its API representation.
See more: google.cloud.bigquery.external_config.ExternalCatalogDatasetOptions.from_api_repr
google.cloud.bigquery.external_config.ExternalCatalogDatasetOptions.to_api_repr
to_api_repr() -> dictBuild an API representation of this object.
See more: google.cloud.bigquery.external_config.ExternalCatalogDatasetOptions.to_api_repr
google.cloud.bigquery.external_config.ExternalCatalogTableOptions.from_api_repr
from_api_repr(
api_repr: dict,
) -> google.cloud.bigquery.external_config.ExternalCatalogTableOptionsFactory: constructs an instance of the class (cls) given its API representation.
See more: google.cloud.bigquery.external_config.ExternalCatalogTableOptions.from_api_repr
google.cloud.bigquery.external_config.ExternalCatalogTableOptions.to_api_repr
to_api_repr() -> dictBuild an API representation of this object.
See more: google.cloud.bigquery.external_config.ExternalCatalogTableOptions.to_api_repr
google.cloud.bigquery.external_config.ExternalConfig.from_api_repr
from_api_repr(
resource: dict,
) -> google.cloud.bigquery.external_config.ExternalConfigFactory: construct an ExternalConfig instance given its API representation.
See more: google.cloud.bigquery.external_config.ExternalConfig.from_api_repr
google.cloud.bigquery.external_config.ExternalConfig.to_api_repr
to_api_repr() -> dictBuild an API representation of this object.
See more: google.cloud.bigquery.external_config.ExternalConfig.to_api_repr
google.cloud.bigquery.external_config.GoogleSheetsOptions.from_api_repr
from_api_repr(
resource: dict,
) -> google.cloud.bigquery.external_config.GoogleSheetsOptionsFactory: construct a GoogleSheetsOptions instance given its API representation.
See more: google.cloud.bigquery.external_config.GoogleSheetsOptions.from_api_repr
google.cloud.bigquery.external_config.GoogleSheetsOptions.to_api_repr
to_api_repr() -> dictBuild an API representation of this object.
See more: google.cloud.bigquery.external_config.GoogleSheetsOptions.to_api_repr
google.cloud.bigquery.external_config.HivePartitioningOptions.from_api_repr
from_api_repr(
resource: dict,
) -> google.cloud.bigquery.external_config.HivePartitioningOptionsFactory: construct a HivePartitioningOptions instance given its API representation.
See more: google.cloud.bigquery.external_config.HivePartitioningOptions.from_api_repr
google.cloud.bigquery.external_config.HivePartitioningOptions.to_api_repr
to_api_repr() -> dictBuild an API representation of this object.
See more: google.cloud.bigquery.external_config.HivePartitioningOptions.to_api_repr
google.cloud.bigquery.format_options.AvroOptions.from_api_repr
from_api_repr(
resource: typing.Dict[str, bool],
) -> google.cloud.bigquery.format_options.AvroOptionsFactory: construct an instance from a resource dict.
See more: google.cloud.bigquery.format_options.AvroOptions.from_api_repr
google.cloud.bigquery.format_options.AvroOptions.to_api_repr
to_api_repr() -> dictBuild an API representation of this object.
See more: google.cloud.bigquery.format_options.AvroOptions.to_api_repr
google.cloud.bigquery.format_options.ParquetOptions.from_api_repr
from_api_repr(
resource: typing.Dict[str, bool],
) -> google.cloud.bigquery.format_options.ParquetOptionsFactory: construct an instance from a resource dict.
See more: google.cloud.bigquery.format_options.ParquetOptions.from_api_repr
google.cloud.bigquery.format_options.ParquetOptions.to_api_repr
to_api_repr() -> dictBuild an API representation of this object.
See more: google.cloud.bigquery.format_options.ParquetOptions.to_api_repr
google.cloud.bigquery.job.CopyJob.add_done_callback
add_done_callback(fn)Add a callback to be executed when the operation is complete.
See more: google.cloud.bigquery.job.CopyJob.add_done_callback
google.cloud.bigquery.job.CopyJob.cancel
cancel(
client=None,
retry: typing.Optional[
google.api_core.retry.retry_unary.Retry
] = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> boolAPI call: cancel job via a POST request.
See more: google.cloud.bigquery.job.CopyJob.cancel
google.cloud.bigquery.job.CopyJob.cancelled
cancelled()Check if the job has been cancelled.
google.cloud.bigquery.job.CopyJob.done
done(
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = 128,
reload: bool = True,
) -> boolChecks if the job is complete.
See more: google.cloud.bigquery.job.CopyJob.done
google.cloud.bigquery.job.CopyJob.exception
exception(timeout=object)Get the exception from the operation, blocking if necessary.
google.cloud.bigquery.job.CopyJob.exists
exists(
client=None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> boolAPI call: test for the existence of the job via a GET request.
See more: google.cloud.bigquery.job.CopyJob.exists
google.cloud.bigquery.job.CopyJob.from_api_repr
from_api_repr(resource, client)Factory: construct a job given its API representation.
google.cloud.bigquery.job.CopyJob.reload
reload(
client=None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = 128,
)API call: refresh job properties via a GET request.
See more: google.cloud.bigquery.job.CopyJob.reload
google.cloud.bigquery.job.CopyJob.result
result(
retry: typing.Optional[
google.api_core.retry.retry_unary.Retry
] = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> google.cloud.bigquery.job.base._AsyncJobStart the job and wait for it to complete and get the result.
See more: google.cloud.bigquery.job.CopyJob.result
google.cloud.bigquery.job.CopyJob.running
running()True if the operation is currently running.
google.cloud.bigquery.job.CopyJob.set_exception
set_exception(exception)Set the Future's exception.
google.cloud.bigquery.job.CopyJob.set_result
set_result(result)Set the Future's result.
google.cloud.bigquery.job.CopyJob.to_api_repr
to_api_repr()Generate a resource for _begin.
google.cloud.bigquery.job.CopyJobConfig.__setattr__
__setattr__(name, value)Override to be able to raise error if an unknown property is being set.
google.cloud.bigquery.job.CopyJobConfig.from_api_repr
from_api_repr(resource: dict) -> google.cloud.bigquery.job.base._JobConfigFactory: construct a job configuration given its API representation.
See more: google.cloud.bigquery.job.CopyJobConfig.from_api_repr
google.cloud.bigquery.job.CopyJobConfig.to_api_repr
to_api_repr() -> dictBuild an API representation of the job config.
See more: google.cloud.bigquery.job.CopyJobConfig.to_api_repr
google.cloud.bigquery.job.DmlStats
DmlStats(
inserted_row_count: int = 0, deleted_row_count: int = 0, updated_row_count: int = 0
)Create new instance of DmlStats(inserted_row_count, deleted_row_count, updated_row_count).
See more: google.cloud.bigquery.job.DmlStats
google.cloud.bigquery.job.DmlStats.count
count(value, /)Return number of occurrences of value.
See more: google.cloud.bigquery.job.DmlStats.count
google.cloud.bigquery.job.DmlStats.index
index(value, start=0, stop=9223372036854775807, /)Return first index of value.
See more: google.cloud.bigquery.job.DmlStats.index
google.cloud.bigquery.job.ExtractJob.add_done_callback
add_done_callback(fn)Add a callback to be executed when the operation is complete.
See more: google.cloud.bigquery.job.ExtractJob.add_done_callback
google.cloud.bigquery.job.ExtractJob.cancel
cancel(
client=None,
retry: typing.Optional[
google.api_core.retry.retry_unary.Retry
] = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> boolAPI call: cancel job via a POST request.
google.cloud.bigquery.job.ExtractJob.cancelled
cancelled()Check if the job has been cancelled.
google.cloud.bigquery.job.ExtractJob.done
done(
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = 128,
reload: bool = True,
) -> boolChecks if the job is complete.
google.cloud.bigquery.job.ExtractJob.exception
exception(timeout=object)Get the exception from the operation, blocking if necessary.
google.cloud.bigquery.job.ExtractJob.exists
exists(
client=None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> boolAPI call: test for the existence of the job via a GET request.
google.cloud.bigquery.job.ExtractJob.from_api_repr
from_api_repr(
resource: dict, client
) -> google.cloud.bigquery.job.extract.ExtractJobFactory: construct a job given its API representation.
See more: google.cloud.bigquery.job.ExtractJob.from_api_repr
google.cloud.bigquery.job.ExtractJob.reload
reload(
client=None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = 128,
)API call: refresh job properties via a GET request.
google.cloud.bigquery.job.ExtractJob.result
result(
retry: typing.Optional[
google.api_core.retry.retry_unary.Retry
] = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> google.cloud.bigquery.job.base._AsyncJobStart the job and wait for it to complete and get the result.
google.cloud.bigquery.job.ExtractJob.running
running()True if the operation is currently running.
google.cloud.bigquery.job.ExtractJob.set_exception
set_exception(exception)Set the Future's exception.
See more: google.cloud.bigquery.job.ExtractJob.set_exception
google.cloud.bigquery.job.ExtractJob.set_result
set_result(result)Set the Future's result.
google.cloud.bigquery.job.ExtractJob.to_api_repr
to_api_repr()Generate a resource for _begin.
google.cloud.bigquery.job.ExtractJobConfig.__setattr__
__setattr__(name, value)Override to be able to raise error if an unknown property is being set.
See more: google.cloud.bigquery.job.ExtractJobConfig.__setattr__
google.cloud.bigquery.job.ExtractJobConfig.from_api_repr
from_api_repr(resource: dict) -> google.cloud.bigquery.job.base._JobConfigFactory: construct a job configuration given its API representation.
See more: google.cloud.bigquery.job.ExtractJobConfig.from_api_repr
google.cloud.bigquery.job.ExtractJobConfig.to_api_repr
to_api_repr() -> dictBuild an API representation of the job config.
See more: google.cloud.bigquery.job.ExtractJobConfig.to_api_repr
google.cloud.bigquery.job.IncrementalResultStats.from_api_repr
from_api_repr(resource) -> google.cloud.bigquery.job.query.IncrementalResultStatsFactory: construct instance from the JSON repr.
See more: google.cloud.bigquery.job.IncrementalResultStats.from_api_repr
google.cloud.bigquery.job.LoadJob.add_done_callback
add_done_callback(fn)Add a callback to be executed when the operation is complete.
See more: google.cloud.bigquery.job.LoadJob.add_done_callback
google.cloud.bigquery.job.LoadJob.cancel
cancel(
client=None,
retry: typing.Optional[
google.api_core.retry.retry_unary.Retry
] = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> boolAPI call: cancel job via a POST request.
See more: google.cloud.bigquery.job.LoadJob.cancel
google.cloud.bigquery.job.LoadJob.cancelled
cancelled()Check if the job has been cancelled.
google.cloud.bigquery.job.LoadJob.done
done(
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = 128,
reload: bool = True,
) -> boolChecks if the job is complete.
See more: google.cloud.bigquery.job.LoadJob.done
google.cloud.bigquery.job.LoadJob.exception
exception(timeout=object)Get the exception from the operation, blocking if necessary.
google.cloud.bigquery.job.LoadJob.exists
exists(
client=None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> boolAPI call: test for the existence of the job via a GET request.
See more: google.cloud.bigquery.job.LoadJob.exists
google.cloud.bigquery.job.LoadJob.from_api_repr
from_api_repr(resource: dict, client) -> google.cloud.bigquery.job.load.LoadJobFactory: construct a job given its API representation.
google.cloud.bigquery.job.LoadJob.reload
reload(
client=None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = 128,
)API call: refresh job properties via a GET request.
See more: google.cloud.bigquery.job.LoadJob.reload
google.cloud.bigquery.job.LoadJob.result
result(
retry: typing.Optional[
google.api_core.retry.retry_unary.Retry
] = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> google.cloud.bigquery.job.base._AsyncJobStart the job and wait for it to complete and get the result.
See more: google.cloud.bigquery.job.LoadJob.result
google.cloud.bigquery.job.LoadJob.running
running()True if the operation is currently running.
google.cloud.bigquery.job.LoadJob.set_exception
set_exception(exception)Set the Future's exception.
google.cloud.bigquery.job.LoadJob.set_result
set_result(result)Set the Future's result.
google.cloud.bigquery.job.LoadJob.to_api_repr
to_api_repr()Generate a resource for _begin.
google.cloud.bigquery.job.LoadJobConfig.__setattr__
__setattr__(name, value)Override to be able to raise error if an unknown property is being set.
google.cloud.bigquery.job.LoadJobConfig.from_api_repr
from_api_repr(resource: dict) -> google.cloud.bigquery.job.base._JobConfigFactory: construct a job configuration given its API representation.
See more: google.cloud.bigquery.job.LoadJobConfig.from_api_repr
google.cloud.bigquery.job.LoadJobConfig.to_api_repr
to_api_repr() -> dictBuild an API representation of the job config.
See more: google.cloud.bigquery.job.LoadJobConfig.to_api_repr
google.cloud.bigquery.job.QueryJob.add_done_callback
add_done_callback(fn)Add a callback to be executed when the operation is complete.
See more: google.cloud.bigquery.job.QueryJob.add_done_callback
google.cloud.bigquery.job.QueryJob.cancel
cancel(
client=None,
retry: typing.Optional[
google.api_core.retry.retry_unary.Retry
] = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> boolAPI call: cancel job via a POST request.
google.cloud.bigquery.job.QueryJob.cancelled
cancelled()Check if the job has been cancelled.
google.cloud.bigquery.job.QueryJob.done
done(
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = 128,
reload: bool = True,
) -> boolChecks if the job is complete.
See more: google.cloud.bigquery.job.QueryJob.done
google.cloud.bigquery.job.QueryJob.exception
exception(timeout=object)Get the exception from the operation, blocking if necessary.
google.cloud.bigquery.job.QueryJob.exists
exists(
client=None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> boolAPI call: test for the existence of the job via a GET request.
google.cloud.bigquery.job.QueryJob.from_api_repr
from_api_repr(resource: dict, client: Client) -> QueryJobFactory: construct a job given its API representation.
google.cloud.bigquery.job.QueryJob.reload
reload(
client=None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = 128,
)API call: refresh job properties via a GET request.
google.cloud.bigquery.job.QueryJob.result
result(
page_size: typing.Optional[int] = None,
max_results: typing.Optional[int] = None,
retry: typing.Optional[
google.api_core.retry.retry_unary.Retry
] = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[typing.Union[float, object]] = object,
start_index: typing.Optional[int] = None,
job_retry: typing.Optional[
google.api_core.retry.retry_unary.Retry
] = google.api_core.retry.retry_unary.Retry,
) -> typing.Union[RowIterator, google.cloud.bigquery.table._EmptyRowIterator]Start the job and wait for it to complete and get the result.
google.cloud.bigquery.job.QueryJob.running
running()True if the operation is currently running.
google.cloud.bigquery.job.QueryJob.set_exception
set_exception(exception)Set the Future's exception.
google.cloud.bigquery.job.QueryJob.set_result
set_result(result)Set the Future's result.
google.cloud.bigquery.job.QueryJob.to_api_repr
to_api_repr()Generate a resource for _begin.
google.cloud.bigquery.job.QueryJob.to_arrow
to_arrow(
progress_bar_type: typing.Optional[str] = None,
bqstorage_client: typing.Optional[bigquery_storage.BigQueryReadClient] = None,
create_bqstorage_client: bool = True,
max_results: typing.Optional[int] = None,
) -> pyarrow.Table[Beta] Create a pyarrow.Table by loading all pages of a table or query.
google.cloud.bigquery.job.QueryJob.to_dataframe
to_dataframe(
bqstorage_client: typing.Optional[bigquery_storage.BigQueryReadClient] = None,
dtypes: typing.Optional[typing.Dict[str, typing.Any]] = None,
progress_bar_type: typing.Optional[str] = None,
create_bqstorage_client: bool = True,
max_results: typing.Optional[int] = None,
geography_as_object: bool = False,
bool_dtype: typing.Optional[typing.Any] = DefaultPandasDTypes.BOOL_DTYPE,
int_dtype: typing.Optional[typing.Any] = DefaultPandasDTypes.INT_DTYPE,
float_dtype: typing.Optional[typing.Any] = None,
string_dtype: typing.Optional[typing.Any] = None,
date_dtype: typing.Optional[typing.Any] = DefaultPandasDTypes.DATE_DTYPE,
datetime_dtype: typing.Optional[typing.Any] = None,
time_dtype: typing.Optional[typing.Any] = DefaultPandasDTypes.TIME_DTYPE,
timestamp_dtype: typing.Optional[typing.Any] = None,
range_date_dtype: typing.Optional[
typing.Any
] = DefaultPandasDTypes.RANGE_DATE_DTYPE,
range_datetime_dtype: typing.Optional[
typing.Any
] = DefaultPandasDTypes.RANGE_DATETIME_DTYPE,
range_timestamp_dtype: typing.Optional[
typing.Any
] = DefaultPandasDTypes.RANGE_TIMESTAMP_DTYPE,
) -> pandas.DataFrameReturn a pandas DataFrame from a QueryJob.
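A hedged sketch, assuming pandas is installed (the db-dtypes package is also needed for some BigQuery column types):

    from google.cloud import bigquery

    client = bigquery.Client()
    query_job = client.query("SELECT 1 AS x, 'a' AS y")
    dataframe = query_job.to_dataframe()
    print(dataframe.head())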
google.cloud.bigquery.job.QueryJob.to_geodataframe
to_geodataframe(
bqstorage_client: typing.Optional[bigquery_storage.BigQueryReadClient] = None,
dtypes: typing.Optional[typing.Dict[str, typing.Any]] = None,
progress_bar_type: typing.Optional[str] = None,
create_bqstorage_client: bool = True,
max_results: typing.Optional[int] = None,
geography_column: typing.Optional[str] = None,
bool_dtype: typing.Optional[typing.Any] = DefaultPandasDTypes.BOOL_DTYPE,
int_dtype: typing.Optional[typing.Any] = DefaultPandasDTypes.INT_DTYPE,
float_dtype: typing.Optional[typing.Any] = None,
string_dtype: typing.Optional[typing.Any] = None,
) -> geopandas.GeoDataFrameReturn a GeoPandas GeoDataFrame from a QueryJob.
See more: google.cloud.bigquery.job.QueryJob.to_geodataframe
google.cloud.bigquery.job.QueryJobConfig.__setattr__
__setattr__(name, value)Override to be able to raise error if an unknown property is being set.
google.cloud.bigquery.job.QueryJobConfig.from_api_repr
from_api_repr(resource: dict) -> google.cloud.bigquery.job.base._JobConfigFactory: construct a job configuration given its API representation.
See more: google.cloud.bigquery.job.QueryJobConfig.from_api_repr
google.cloud.bigquery.job.QueryJobConfig.to_api_repr
to_api_repr() -> dictBuild an API representation of the query job config.
See more: google.cloud.bigquery.job.QueryJobConfig.to_api_repr
google.cloud.bigquery.job.QueryPlanEntry.from_api_repr
from_api_repr(resource: dict) -> google.cloud.bigquery.job.query.QueryPlanEntryFactory: construct instance from the JSON repr.
See more: google.cloud.bigquery.job.QueryPlanEntry.from_api_repr
google.cloud.bigquery.job.QueryPlanEntryStep.from_api_repr
from_api_repr(resource: dict) -> google.cloud.bigquery.job.query.QueryPlanEntryStepFactory: construct instance from the JSON repr.
See more: google.cloud.bigquery.job.QueryPlanEntryStep.from_api_repr
google.cloud.bigquery.job.ReservationUsage
ReservationUsage(name, slot_ms)Create new instance of ReservationUsage(name, slot_ms).
google.cloud.bigquery.job.ReservationUsage.count
count(value, /)Return number of occurrences of value.
google.cloud.bigquery.job.ReservationUsage.index
index(value, start=0, stop=9223372036854775807, /)Return first index of value.
google.cloud.bigquery.job.ScriptOptions.from_api_repr
from_api_repr(
resource: typing.Dict[str, typing.Any],
) -> google.cloud.bigquery.job.query.ScriptOptionsFactory: construct instance from the JSON repr.
See more: google.cloud.bigquery.job.ScriptOptions.from_api_repr
google.cloud.bigquery.job.ScriptOptions.to_api_repr
to_api_repr() -> typing.Dict[str, typing.Any]Construct the API resource representation.
See more: google.cloud.bigquery.job.ScriptOptions.to_api_repr
google.cloud.bigquery.job.TimelineEntry.from_api_repr
from_api_repr(resource)Factory: construct instance from the JSON repr.
See more: google.cloud.bigquery.job.TimelineEntry.from_api_repr
google.cloud.bigquery.job.TransactionInfo
TransactionInfo(transaction_id: str)Create new instance of TransactionInfo(transaction_id).
google.cloud.bigquery.job.TransactionInfo.count
count(value, /)Return number of occurrences of value.
google.cloud.bigquery.job.TransactionInfo.index
index(value, start=0, stop=9223372036854775807, /)Return first index of value.
google.cloud.bigquery.job.UnknownJob.add_done_callback
add_done_callback(fn)Add a callback to be executed when the operation is complete.
See more: google.cloud.bigquery.job.UnknownJob.add_done_callback
google.cloud.bigquery.job.UnknownJob.cancel
cancel(
client=None,
retry: typing.Optional[
google.api_core.retry.retry_unary.Retry
] = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> boolAPI call: cancel job via a POST request.
google.cloud.bigquery.job.UnknownJob.cancelled
cancelled()Check if the job has been cancelled.
google.cloud.bigquery.job.UnknownJob.done
done(
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = 128,
reload: bool = True,
) -> boolChecks if the job is complete.
google.cloud.bigquery.job.UnknownJob.exception
exception(timeout=object)Get the exception from the operation, blocking if necessary.
google.cloud.bigquery.job.UnknownJob.exists
exists(
client=None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> boolAPI call: test for the existence of the job via a GET request.
google.cloud.bigquery.job.UnknownJob.from_api_repr
from_api_repr(resource: dict, client) -> google.cloud.bigquery.job.base.UnknownJobConstruct an UnknownJob from the JSON representation.
See more: google.cloud.bigquery.job.UnknownJob.from_api_repr
google.cloud.bigquery.job.UnknownJob.reload
reload(
client=None,
retry: google.api_core.retry.retry_unary.Retry = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = 128,
)API call: refresh job properties via a GET request.
google.cloud.bigquery.job.UnknownJob.result
result(
retry: typing.Optional[
google.api_core.retry.retry_unary.Retry
] = google.api_core.retry.retry_unary.Retry,
timeout: typing.Optional[float] = None,
) -> google.cloud.bigquery.job.base._AsyncJobStart the job, wait for it to complete, and get the result.
google.cloud.bigquery.job.UnknownJob.running
running()True if the operation is currently running.
google.cloud.bigquery.job.UnknownJob.set_exception
set_exception(exception)Set the Future's exception.
See more: google.cloud.bigquery.job.UnknownJob.set_exception
google.cloud.bigquery.job.UnknownJob.set_result
set_result(result)Set the Future's result.
google.cloud.bigquery.job.UnknownJob.to_api_repr
to_api_repr()Generate a resource for the job.
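A hedged sketch of the shared job interface (exists, done, cancel, reload) that UnknownJob inherits; the job ID and location are placeholders.
from google.cloud import bigquery

client = bigquery.Client()
job = client.get_job("my-job-id", location="US")  # placeholder job ID

if job.exists() and not job.done():
    job.cancel()   # API call: POST .../cancel
job.reload()       # API call: refresh job properties via GET
print(job.job_id, job.state)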
google.cloud.bigquery.job.base.ReservationUsage
ReservationUsage(name, slot_ms)Create new instance of ReservationUsage(name, slot_ms).
google.cloud.bigquery.job.base.TransactionInfo
TransactionInfo(transaction_id: str)Create new instance of TransactionInfo(transaction_id).
google.cloud.bigquery.job.base.UnknownJob.from_api_repr
from_api_repr(resource: dict, client) -> google.cloud.bigquery.job.base.UnknownJobConstruct an UnknownJob from the JSON representation.
See more: google.cloud.bigquery.job.base.UnknownJob.from_api_repr
google.cloud.bigquery.model.Model.from_api_repr
from_api_repr(
resource: typing.Dict[str, typing.Any],
) -> google.cloud.bigquery.model.ModelFactory: construct a model resource given its API representation.
google.cloud.bigquery.model.Model.to_api_repr
to_api_repr() -> typing.Dict[str, typing.Any]Construct the API resource representation of this model.
google.cloud.bigquery.model.ModelReference.from_api_repr
from_api_repr(
resource: typing.Dict[str, typing.Any],
) -> google.cloud.bigquery.model.ModelReferenceFactory: construct a model reference given its API representation.
See more: google.cloud.bigquery.model.ModelReference.from_api_repr
google.cloud.bigquery.model.ModelReference.from_string
from_string(
model_id: str, default_project: typing.Optional[str] = None
) -> google.cloud.bigquery.model.ModelReferenceConstruct a model reference from model ID string.
See more: google.cloud.bigquery.model.ModelReference.from_string
google.cloud.bigquery.model.ModelReference.to_api_repr
to_api_repr() -> typing.Dict[str, typing.Any]Construct the API resource representation of this model reference.
See more: google.cloud.bigquery.model.ModelReference.to_api_repr
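A small sketch of ModelReference.from_string and to_api_repr; the project, dataset, and model IDs are placeholders.
from google.cloud import bigquery

model_ref = bigquery.ModelReference.from_string("my-project.my_dataset.my_model")
print(model_ref.project, model_ref.dataset_id, model_ref.model_id)
print(model_ref.to_api_repr())  # {"projectId": ..., "datasetId": ..., "modelId": ...}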
google.cloud.bigquery.model.TransformColumn.from_api_repr
from_api_repr(
resource: typing.Dict[str, typing.Any],
) -> google.cloud.bigquery.model.TransformColumnConstructs a transform column feature given its API representation.
See more: google.cloud.bigquery.model.TransformColumn.from_api_repr
google.cloud.bigquery.query.ArrayQueryParameter.from_api_repr
from_api_repr(resource: dict) -> google.cloud.bigquery.query.ArrayQueryParameterFactory: construct parameter from JSON resource.
See more: google.cloud.bigquery.query.ArrayQueryParameter.from_api_repr
google.cloud.bigquery.query.ArrayQueryParameter.positional
positional(
array_type: str, values: list
) -> google.cloud.bigquery.query.ArrayQueryParameterFactory for positional parameters.
See more: google.cloud.bigquery.query.ArrayQueryParameter.positional
google.cloud.bigquery.query.ArrayQueryParameter.to_api_repr
to_api_repr() -> dictConstruct JSON API representation for the parameter.
See more: google.cloud.bigquery.query.ArrayQueryParameter.to_api_repr
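A hedged sketch of passing an ARRAY<STRING> parameter to a query; the parameter name is illustrative and the query reads from a public dataset.
from google.cloud import bigquery

client = bigquery.Client()
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ArrayQueryParameter("corpora", "STRING", ["hamlet", "kinglear"]),
    ]
)
query = """
    SELECT word, SUM(word_count) AS total
    FROM `bigquery-public-data.samples.shakespeare`
    WHERE corpus IN UNNEST(@corpora)
    GROUP BY word
    ORDER BY total DESC
    LIMIT 10
"""
for row in client.query(query, job_config=job_config).result():
    print(row["word"], row["total"])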
google.cloud.bigquery.query.ArrayQueryParameterType.from_api_repr
from_api_repr(resource)Factory: construct parameter type from JSON resource.
See more: google.cloud.bigquery.query.ArrayQueryParameterType.from_api_repr
google.cloud.bigquery.query.ArrayQueryParameterType.to_api_repr
to_api_repr()Construct JSON API representation for the parameter type.
See more: google.cloud.bigquery.query.ArrayQueryParameterType.to_api_repr
google.cloud.bigquery.query.ConnectionProperty.from_api_repr
from_api_repr(resource) -> google.cloud.bigquery.query.ConnectionPropertyConstruct a ConnectionProperty from JSON resource.
See more: google.cloud.bigquery.query.ConnectionProperty.from_api_repr
google.cloud.bigquery.query.ConnectionProperty.to_api_repr
to_api_repr() -> typing.Dict[str, typing.Any]Construct JSON API representation for the connection property.
See more: google.cloud.bigquery.query.ConnectionProperty.to_api_repr
google.cloud.bigquery.query.RangeQueryParameter.from_api_repr
from_api_repr(resource: dict) -> google.cloud.bigquery.query.RangeQueryParameterFactory: construct parameter from JSON resource.
See more: google.cloud.bigquery.query.RangeQueryParameter.from_api_repr
google.cloud.bigquery.query.RangeQueryParameter.positional
positional(
range_element_type, start=None, end=None
) -> google.cloud.bigquery.query.RangeQueryParameterFactory for positional parameters.
See more: google.cloud.bigquery.query.RangeQueryParameter.positional
google.cloud.bigquery.query.RangeQueryParameter.to_api_repr
to_api_repr() -> dictConstruct JSON API representation for the parameter.
See more: google.cloud.bigquery.query.RangeQueryParameter.to_api_repr
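A hedged sketch of the positional factory above for a RANGE<DATE> parameter; the dates are illustrative and availability depends on the library version.
import datetime
from google.cloud import bigquery

param = bigquery.query.RangeQueryParameter.positional(
    "DATE",
    start=datetime.date(2024, 1, 1),
    end=datetime.date(2024, 12, 31),
)
print(param.to_api_repr())  # parameterType RANGE with a DATE element type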
google.cloud.bigquery.query.RangeQueryParameterType.from_api_repr
from_api_repr(resource)Factory: construct parameter type from JSON resource.
See more: google.cloud.bigquery.query.RangeQueryParameterType.from_api_repr
google.cloud.bigquery.query.RangeQueryParameterType.to_api_repr
to_api_repr()Construct JSON API representation for the parameter type.
See more: google.cloud.bigquery.query.RangeQueryParameterType.to_api_repr
google.cloud.bigquery.query.RangeQueryParameterType.with_name
with_name(new_name: typing.Optional[str])Return a copy of the instance with name set to new_name.
See more: google.cloud.bigquery.query.RangeQueryParameterType.with_name
google.cloud.bigquery.query.ScalarQueryParameter.from_api_repr
from_api_repr(resource: dict) -> google.cloud.bigquery.query.ScalarQueryParameterFactory: construct parameter from JSON resource.
See more: google.cloud.bigquery.query.ScalarQueryParameter.from_api_repr
google.cloud.bigquery.query.ScalarQueryParameter.positional
positional(
type_: typing.Union[str, google.cloud.bigquery.query.ScalarQueryParameterType],
value: typing.Optional[
typing.Union[
str, int, float, decimal.Decimal, bool, datetime.datetime, datetime.date
]
],
) -> google.cloud.bigquery.query.ScalarQueryParameterFactory for positional parameter.
See more: google.cloud.bigquery.query.ScalarQueryParameter.positional
google.cloud.bigquery.query.ScalarQueryParameter.to_api_repr
to_api_repr() -> dictConstruct JSON API representation for the parameter.
See more: google.cloud.bigquery.query.ScalarQueryParameter.to_api_repr
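A short sketch of named and positional scalar parameters; the names and values are illustrative.
from google.cloud import bigquery

named = bigquery.ScalarQueryParameter("min_word_count", "INT64", 250)
positional = bigquery.ScalarQueryParameter.positional("STRING", "romeoandjuliet")
print(named.to_api_repr())
print(positional.to_api_repr())
# Either form is passed via QueryJobConfig(query_parameters=[...]) and is
# referenced as @min_word_count (named) or ? (positional) in the SQL text.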
google.cloud.bigquery.query.ScalarQueryParameterType.from_api_repr
from_api_repr(resource)Factory: construct parameter type from JSON resource.
See more: google.cloud.bigquery.query.ScalarQueryParameterType.from_api_repr
google.cloud.bigquery.query.ScalarQueryParameterType.to_api_repr
to_api_repr()Construct JSON API representation for the parameter type.
See more: google.cloud.bigquery.query.ScalarQueryParameterType.to_api_repr
google.cloud.bigquery.query.ScalarQueryParameterType.with_name
with_name(new_name: typing.Optional[str])Return a copy of the instance with name set to new_name.
See more: google.cloud.bigquery.query.ScalarQueryParameterType.with_name
google.cloud.bigquery.query.StructQueryParameter.from_api_repr
from_api_repr(resource: dict) -> google.cloud.bigquery.query.StructQueryParameterFactory: construct parameter from JSON resource.
See more: google.cloud.bigquery.query.StructQueryParameter.from_api_repr
google.cloud.bigquery.query.StructQueryParameter.positional
positional(*sub_params)Factory for positional parameters.
See more: google.cloud.bigquery.query.StructQueryParameter.positional
google.cloud.bigquery.query.StructQueryParameter.to_api_repr
to_api_repr() -> dictConstruct JSON API representation for the parameter.
See more: google.cloud.bigquery.query.StructQueryParameter.to_api_repr
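A hedged sketch of a STRUCT parameter built from scalar sub-parameters; the field names are illustrative.
from google.cloud import bigquery

point = bigquery.StructQueryParameter(
    "point",
    bigquery.ScalarQueryParameter("x", "INT64", 1),
    bigquery.ScalarQueryParameter("y", "INT64", 2),
)
# Positional variant, referenced as ? in the SQL text:
anonymous = bigquery.StructQueryParameter.positional(
    bigquery.ScalarQueryParameter("x", "INT64", 1),
    bigquery.ScalarQueryParameter("y", "INT64", 2),
)
print(point.to_api_repr())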
google.cloud.bigquery.query.StructQueryParameterType.from_api_repr
from_api_repr(resource)Factory: construct parameter type from JSON resource.
See more: google.cloud.bigquery.query.StructQueryParameterType.from_api_repr
google.cloud.bigquery.query.StructQueryParameterType.to_api_repr
to_api_repr()Construct JSON API representation for the parameter type.
See more: google.cloud.bigquery.query.StructQueryParameterType.to_api_repr
google.cloud.bigquery.routine.ExternalRuntimeOptions.from_api_repr
from_api_repr(
resource: dict,
) -> google.cloud.bigquery.routine.routine.ExternalRuntimeOptionsFactory: construct external runtime options given its API representation.
See more: google.cloud.bigquery.routine.ExternalRuntimeOptions.from_api_repr
google.cloud.bigquery.routine.ExternalRuntimeOptions.to_api_repr
to_api_repr() -> dictConstruct the API resource representation of this ExternalRuntimeOptions.
See more: google.cloud.bigquery.routine.ExternalRuntimeOptions.to_api_repr
google.cloud.bigquery.routine.RemoteFunctionOptions.from_api_repr
from_api_repr(
resource: dict,
) -> google.cloud.bigquery.routine.routine.RemoteFunctionOptionsFactory: construct remote function options given its API representation.
See more: google.cloud.bigquery.routine.RemoteFunctionOptions.from_api_repr
google.cloud.bigquery.routine.RemoteFunctionOptions.to_api_repr
to_api_repr() -> dictConstruct the API resource representation of this RemoteFunctionOptions.
See more: google.cloud.bigquery.routine.RemoteFunctionOptions.to_api_repr
google.cloud.bigquery.routine.Routine.from_api_repr
from_api_repr(resource: dict) -> google.cloud.bigquery.routine.routine.RoutineFactory: construct a routine given its API representation.
See more: google.cloud.bigquery.routine.Routine.from_api_repr
google.cloud.bigquery.routine.Routine.to_api_repr
to_api_repr() -> dictConstruct the API resource representation of this routine.
google.cloud.bigquery.routine.RoutineArgument.from_api_repr
from_api_repr(
resource: dict,
) -> google.cloud.bigquery.routine.routine.RoutineArgumentFactory: construct a routine argument given its API representation.
See more: google.cloud.bigquery.routine.RoutineArgument.from_api_repr
google.cloud.bigquery.routine.RoutineArgument.to_api_repr
to_api_repr() -> dictConstruct the API resource representation of this routine argument.
See more: google.cloud.bigquery.routine.RoutineArgument.to_api_repr
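A hedged sketch of defining a SQL UDF with one typed argument and creating it; the project, dataset, and routine IDs are placeholders, and the StandardSqlDataType spelling may vary across library versions.
from google.cloud import bigquery

client = bigquery.Client()
routine = bigquery.Routine(
    "my-project.my_dataset.multiply_by_three",  # placeholder routine ID
    type_="SCALAR_FUNCTION",
    language="SQL",
    body="x * 3",
    arguments=[
        bigquery.RoutineArgument(
            name="x",
            data_type=bigquery.StandardSqlDataType(
                type_kind=bigquery.StandardSqlTypeNames.INT64
            ),
        )
    ],
)
routine = client.create_routine(routine, exists_ok=True)
print(routine.reference.to_api_repr())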
google.cloud.bigquery.routine.RoutineReference.__eq__
__eq__(other)Two RoutineReferences are equal if they point to the same routine.
google.cloud.bigquery.routine.RoutineReference.__str__
__str__()String representation of the reference.
See more: google.cloud.bigquery.routine.RoutineReference.__str__
google.cloud.bigquery.routine.RoutineReference.from_api_repr
from_api_repr(
resource: dict,
) -> google.cloud.bigquery.routine.routine.RoutineReferenceFactory: construct a routine reference given its API representation.
See more: google.cloud.bigquery.routine.RoutineReference.from_api_repr
google.cloud.bigquery.routine.RoutineReference.from_string
from_string(
routine_id: str, default_project: typing.Optional[str] = None
) -> google.cloud.bigquery.routine.routine.RoutineReferenceFactory: construct a routine reference from routine ID string.
See more: google.cloud.bigquery.routine.RoutineReference.from_string
google.cloud.bigquery.routine.RoutineReference.to_api_repr
to_api_repr() -> dictConstruct the API resource representation of this routine reference.
See more: google.cloud.bigquery.routine.RoutineReference.to_api_repr
google.cloud.bigquery.schema.FieldElementType.from_api_repr
from_api_repr(
api_repr: typing.Optional[dict],
) -> typing.Optional[google.cloud.bigquery.schema.FieldElementType]Factory: construct a FieldElementType given its API representation.
See more: google.cloud.bigquery.schema.FieldElementType.from_api_repr
google.cloud.bigquery.schema.FieldElementType.to_api_repr
to_api_repr() -> dictConstruct the API resource representation of this field element type.
See more: google.cloud.bigquery.schema.FieldElementType.to_api_repr
google.cloud.bigquery.schema.ForeignTypeInfo.from_api_repr
from_api_repr(
api_repr: typing.Dict[str, typing.Any],
) -> google.cloud.bigquery.schema.ForeignTypeInfoFactory: construct a ForeignTypeInfo given its API representation.
See more: google.cloud.bigquery.schema.ForeignTypeInfo.from_api_repr
google.cloud.bigquery.schema.ForeignTypeInfo.to_api_repr
to_api_repr() -> dictBuild an API representation of this object.
See more: google.cloud.bigquery.schema.ForeignTypeInfo.to_api_repr
google.cloud.bigquery.schema.PolicyTagList.from_api_repr
from_api_repr(api_repr: dict) -> google.cloud.bigquery.schema.PolicyTagListReturn a PolicyTagList object deserialized from a dict.
See more: google.cloud.bigquery.schema.PolicyTagList.from_api_repr
google.cloud.bigquery.schema.PolicyTagList.to_api_repr
to_api_repr() -> dictReturn a dictionary representing this object.
See more: google.cloud.bigquery.schema.PolicyTagList.to_api_repr
google.cloud.bigquery.schema.SchemaField.from_api_repr
from_api_repr(api_repr: dict) -> google.cloud.bigquery.schema.SchemaFieldReturn a SchemaField object deserialized from a dictionary.
See more: google.cloud.bigquery.schema.SchemaField.from_api_repr
google.cloud.bigquery.schema.SchemaField.to_api_repr
to_api_repr() -> dictReturn a dictionary representing this schema field.
See more: google.cloud.bigquery.schema.SchemaField.to_api_repr
google.cloud.bigquery.schema.SchemaField.to_standard_sql
to_standard_sql() -> google.cloud.bigquery.standard_sql.StandardSqlFieldReturn the field as the standard SQL field representation object.
See more: google.cloud.bigquery.schema.SchemaField.to_standard_sql
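A small sketch of a nested SchemaField and its API and standard SQL representations; the field names are illustrative.
from google.cloud import bigquery

address = bigquery.SchemaField(
    "address",
    "RECORD",
    mode="NULLABLE",
    fields=[
        bigquery.SchemaField("city", "STRING"),
        bigquery.SchemaField("zip", "STRING"),
    ],
)
print(address.to_api_repr())      # {"name": "address", "type": "RECORD", ...}
print(address.to_standard_sql())  # StandardSqlField with a STRUCT data type
round_trip = bigquery.SchemaField.from_api_repr(address.to_api_repr())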
google.cloud.bigquery.schema.SerDeInfo.from_api_repr
from_api_repr(api_repr: dict) -> google.cloud.bigquery.schema.SerDeInfoFactory: construct a SerDeInfo given its API representation.
See more: google.cloud.bigquery.schema.SerDeInfo.from_api_repr
google.cloud.bigquery.schema.SerDeInfo.to_api_repr
to_api_repr() -> dictBuild an API representation of this object.
See more: google.cloud.bigquery.schema.SerDeInfo.to_api_repr
google.cloud.bigquery.schema.StorageDescriptor.from_api_repr
from_api_repr(resource: dict) -> google.cloud.bigquery.schema.StorageDescriptorFactory: construct a StorageDescriptor given its API representation.
See more: google.cloud.bigquery.schema.StorageDescriptor.from_api_repr
google.cloud.bigquery.schema.StorageDescriptor.to_api_repr
to_api_repr() -> dictBuild an API representation of this object.
See more: google.cloud.bigquery.schema.StorageDescriptor.to_api_repr
google.cloud.bigquery.standard_sql.StandardSqlDataType.from_api_repr
from_api_repr(resource: typing.Dict[str, typing.Any])Construct an SQL data type instance given its API representation.
See more: google.cloud.bigquery.standard_sql.StandardSqlDataType.from_api_repr
google.cloud.bigquery.standard_sql.StandardSqlDataType.to_api_repr
to_api_repr() -> typing.Dict[str, typing.Any]Construct the API resource representation of this SQL data type.
See more: google.cloud.bigquery.standard_sql.StandardSqlDataType.to_api_repr
google.cloud.bigquery.standard_sql.StandardSqlField.from_api_repr
from_api_repr(resource: typing.Dict[str, typing.Any])Construct an SQL field instance given its API representation.
See more: google.cloud.bigquery.standard_sql.StandardSqlField.from_api_repr
google.cloud.bigquery.standard_sql.StandardSqlField.to_api_repr
to_api_repr() -> typing.Dict[str, typing.Any]Construct the API resource representation of this SQL field.
See more: google.cloud.bigquery.standard_sql.StandardSqlField.to_api_repr
google.cloud.bigquery.standard_sql.StandardSqlStructType.from_api_repr
from_api_repr(
resource: typing.Dict[str, typing.Any],
) -> google.cloud.bigquery.standard_sql.StandardSqlStructTypeConstruct an SQL struct type instance given its API representation.
See more: google.cloud.bigquery.standard_sql.StandardSqlStructType.from_api_repr
google.cloud.bigquery.standard_sql.StandardSqlStructType.to_api_repr
to_api_repr() -> typing.Dict[str, typing.Any]Construct the API resource representation of this SQL struct type.
See more: google.cloud.bigquery.standard_sql.StandardSqlStructType.to_api_repr
google.cloud.bigquery.standard_sql.StandardSqlTableType.from_api_repr
from_api_repr(
resource: typing.Dict[str, typing.Any],
) -> google.cloud.bigquery.standard_sql.StandardSqlTableTypeConstruct an SQL table type instance given its API representation.
See more: google.cloud.bigquery.standard_sql.StandardSqlTableType.from_api_repr
google.cloud.bigquery.standard_sql.StandardSqlTableType.to_api_repr
to_api_repr() -> typing.Dict[str, typing.Any]Construct the API resource representation of this SQL table type.
See more: google.cloud.bigquery.standard_sql.StandardSqlTableType.to_api_repr
google.cloud.bigquery.table.BigLakeConfiguration.from_api_repr
from_api_repr(
resource: typing.Dict[str, typing.Any],
) -> google.cloud.bigquery.table.BigLakeConfigurationFactory: construct a BigLakeConfiguration given its API representation.
See more: google.cloud.bigquery.table.BigLakeConfiguration.from_api_repr
google.cloud.bigquery.table.BigLakeConfiguration.to_api_repr
to_api_repr() -> typing.Dict[str, typing.Any]Construct the API resource representation of this BigLakeConfiguration.
See more: google.cloud.bigquery.table.BigLakeConfiguration.to_api_repr
google.cloud.bigquery.table.ForeignKey.from_api_repr
from_api_repr(
api_repr: typing.Dict[str, typing.Any],
) -> google.cloud.bigquery.table.ForeignKeyCreate an instance from API representation.
See more: google.cloud.bigquery.table.ForeignKey.from_api_repr
google.cloud.bigquery.table.ForeignKey.to_api_repr
to_api_repr() -> typing.Dict[str, typing.Any]Return a dictionary representing this object.
See more: google.cloud.bigquery.table.ForeignKey.to_api_repr
google.cloud.bigquery.table.Row.get
get(key: str, default: typing.Optional[typing.Any] = None) -> typing.AnyReturn a value for key, with a default value if it does not exist.
See more: google.cloud.bigquery.table.Row.get
google.cloud.bigquery.table.Row.items
items() -> typing.Iterable[typing.Tuple[str, typing.Any]]Return items as (key, value) pairs.
See more: google.cloud.bigquery.table.Row.items
google.cloud.bigquery.table.Row.keys
keys() -> typing.Iterable[str]Return the keys for using a row as a dict.
See more: google.cloud.bigquery.table.Row.keys
google.cloud.bigquery.table.Row.values
values()Return the values included in this row.
See more: google.cloud.bigquery.table.Row.values
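A sketch of dict-style access on result rows; the query is a literal SELECT so it does not depend on any project data.
from google.cloud import bigquery

client = bigquery.Client()
for row in client.query("SELECT 'wa' AS state, 2 AS num").result():
    print(row["state"], row.num)             # access by key or by attribute
    print(row.get("missing", default=None))  # default instead of KeyError
    print(dict(row.items()), list(row.keys()))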
google.cloud.bigquery.table.RowIterator.to_arrow
to_arrow(
progress_bar_type: typing.Optional[str] = None,
bqstorage_client: typing.Optional[bigquery_storage.BigQueryReadClient] = None,
create_bqstorage_client: bool = True,
) -> pyarrow.Table[Beta] Create a pyarrow.Table by loading all pages of a table or query.
google.cloud.bigquery.table.RowIterator.to_arrow_iterable
to_arrow_iterable(
bqstorage_client: typing.Optional[bigquery_storage.BigQueryReadClient] = None,
max_queue_size: int = object,
max_stream_count: typing.Optional[int] = None,
) -> typing.Iterator[pyarrow.RecordBatch][Beta] Create an iterable of pyarrow.RecordBatch, to process the table as a stream.
See more: google.cloud.bigquery.table.RowIterator.to_arrow_iterable
google.cloud.bigquery.table.RowIterator.to_dataframe
to_dataframe(
bqstorage_client: typing.Optional[bigquery_storage.BigQueryReadClient] = None,
dtypes: typing.Optional[typing.Dict[str, typing.Any]] = None,
progress_bar_type: typing.Optional[str] = None,
create_bqstorage_client: bool = True,
geography_as_object: bool = False,
bool_dtype: typing.Optional[typing.Any] = DefaultPandasDTypes.BOOL_DTYPE,
int_dtype: typing.Optional[typing.Any] = DefaultPandasDTypes.INT_DTYPE,
float_dtype: typing.Optional[typing.Any] = None,
string_dtype: typing.Optional[typing.Any] = None,
date_dtype: typing.Optional[typing.Any] = DefaultPandasDTypes.DATE_DTYPE,
datetime_dtype: typing.Optional[typing.Any] = None,
time_dtype: typing.Optional[typing.Any] = DefaultPandasDTypes.TIME_DTYPE,
timestamp_dtype: typing.Optional[typing.Any] = None,
range_date_dtype: typing.Optional[
typing.Any
] = DefaultPandasDTypes.RANGE_DATE_DTYPE,
range_datetime_dtype: typing.Optional[
typing.Any
] = DefaultPandasDTypes.RANGE_DATETIME_DTYPE,
range_timestamp_dtype: typing.Optional[
typing.Any
] = DefaultPandasDTypes.RANGE_TIMESTAMP_DTYPE,
) -> pandas.DataFrameCreate a pandas DataFrame by loading all pages of a query.
See more: google.cloud.bigquery.table.RowIterator.to_dataframe
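A hedged sketch of materializing query results as a pandas DataFrame; it assumes the pandas extra is installed, and the dtype override and public dataset are illustrative.
import pandas
from google.cloud import bigquery

client = bigquery.Client()
row_iterator = client.query(
    "SELECT name, SUM(number) AS total"
    " FROM `bigquery-public-data.usa_names.usa_1910_2013`"
    " GROUP BY name ORDER BY total DESC LIMIT 10"
).result()

df = row_iterator.to_dataframe(
    create_bqstorage_client=False,  # small result; skip the Storage API client
    int_dtype=pandas.Int64Dtype(),  # nullable integer columns
)
print(df.head())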
google.cloud.bigquery.table.RowIterator.to_dataframe_iterable
to_dataframe_iterable(
bqstorage_client: typing.Optional[bigquery_storage.BigQueryReadClient] = None,
dtypes: typing.Optional[typing.Dict[str, typing.Any]] = None,
max_queue_size: int = object,
max_stream_count: typing.Optional[int] = None,
) -> pandas.DataFrameCreate an iterable of pandas DataFrames, to process the table as a stream.
See more: google.cloud.bigquery.table.RowIterator.to_dataframe_iterable
google.cloud.bigquery.table.RowIterator.to_geodataframe
to_geodataframe(
bqstorage_client: typing.Optional[bigquery_storage.BigQueryReadClient] = None,
dtypes: typing.Optional[typing.Dict[str, typing.Any]] = None,
progress_bar_type: typing.Optional[str] = None,
create_bqstorage_client: bool = True,
geography_column: typing.Optional[str] = None,
bool_dtype: typing.Optional[typing.Any] = DefaultPandasDTypes.BOOL_DTYPE,
int_dtype: typing.Optional[typing.Any] = DefaultPandasDTypes.INT_DTYPE,
float_dtype: typing.Optional[typing.Any] = None,
string_dtype: typing.Optional[typing.Any] = None,
) -> geopandas.GeoDataFrameCreate a GeoPandas GeoDataFrame by loading all pages of a query.
See more: google.cloud.bigquery.table.RowIterator.to_geodataframe
google.cloud.bigquery.table.Table.from_api_repr
from_api_repr(resource: dict) -> google.cloud.bigquery.table.TableFactory: construct a table given its API representation.
google.cloud.bigquery.table.Table.from_string
from_string(full_table_id: str) -> google.cloud.bigquery.table.TableConstruct a table from fully-qualified table ID.
google.cloud.bigquery.table.Table.to_api_repr
to_api_repr() -> dictConstructs the API resource of this table.
google.cloud.bigquery.table.Table.to_bqstorage
to_bqstorage() -> strConstruct a BigQuery Storage API representation of this table.
google.cloud.bigquery.table.TableConstraints.from_api_repr
from_api_repr(
resource: typing.Dict[str, typing.Any],
) -> google.cloud.bigquery.table.TableConstraintsCreate an instance from API representation.
See more: google.cloud.bigquery.table.TableConstraints.from_api_repr
google.cloud.bigquery.table.TableConstraints.to_api_repr
to_api_repr() -> typing.Dict[str, typing.Any]Return a dictionary representing this object.
See more: google.cloud.bigquery.table.TableConstraints.to_api_repr
google.cloud.bigquery.table.TableListItem.from_string
from_string(full_table_id: str) -> google.cloud.bigquery.table.TableListItemConstruct a table from fully-qualified table ID.
See more: google.cloud.bigquery.table.TableListItem.from_string
google.cloud.bigquery.table.TableListItem.to_api_repr
to_api_repr() -> dictConstructs the API resource of this table.
See more: google.cloud.bigquery.table.TableListItem.to_api_repr
google.cloud.bigquery.table.TableListItem.to_bqstorage
to_bqstorage() -> strConstruct a BigQuery Storage API representation of this table.
See more: google.cloud.bigquery.table.TableListItem.to_bqstorage
google.cloud.bigquery.table.TableReference.from_api_repr
from_api_repr(resource: dict) -> google.cloud.bigquery.table.TableReferenceFactory: construct a table reference given its API representation.
See more: google.cloud.bigquery.table.TableReference.from_api_repr
google.cloud.bigquery.table.TableReference.from_string
from_string(
table_id: str, default_project: typing.Optional[str] = None
) -> google.cloud.bigquery.table.TableReferenceConstruct a table reference from table ID string.
See more: google.cloud.bigquery.table.TableReference.from_string
google.cloud.bigquery.table.TableReference.to_api_repr
to_api_repr() -> dictConstruct the API resource representation of this table reference.
See more: google.cloud.bigquery.table.TableReference.to_api_repr
google.cloud.bigquery.table.TableReference.to_bqstorage
to_bqstorage() -> strConstruct a BigQuery Storage API representation of this table.
See more: google.cloud.bigquery.table.TableReference.to_bqstorage
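A small sketch of parsing a fully-qualified table ID and inspecting its API and Storage API forms; the IDs point at a public dataset and are illustrative.
from google.cloud import bigquery

table_ref = bigquery.TableReference.from_string(
    "bigquery-public-data.usa_names.usa_1910_2013"
)
print(table_ref.to_api_repr())
# {"projectId": "bigquery-public-data", "datasetId": "usa_names", "tableId": "usa_1910_2013"}
print(table_ref.to_bqstorage())
# projects/bigquery-public-data/datasets/usa_names/tables/usa_1910_2013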
google.cloud.bigquery.table.TimePartitioning.from_api_repr
from_api_repr(api_repr: dict) -> google.cloud.bigquery.table.TimePartitioningReturn a TimePartitioning object deserialized from a dict.
See more: google.cloud.bigquery.table.TimePartitioning.from_api_repr
google.cloud.bigquery.table.TimePartitioning.to_api_repr
to_api_repr() -> dictReturn a dictionary representing this object.
See more: google.cloud.bigquery.table.TimePartitioning.to_api_repr
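A hedged sketch of attaching daily time partitioning to a new table; the project, dataset, table, and column names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()
table = bigquery.Table(
    "my-project.my_dataset.events",  # placeholder table ID
    schema=[
        bigquery.SchemaField("event_ts", "TIMESTAMP"),
        bigquery.SchemaField("payload", "STRING"),
    ],
)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",                        # partition on this column
    expiration_ms=90 * 24 * 60 * 60 * 1000,  # keep partitions for ~90 days
)
table = client.create_table(table, exists_ok=True)
print(table.time_partitioning.to_api_repr())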