4. **Install dependencies** via [Composer](http://getcomposer.org/doc/00-intro.md).
   Run `php composer.phar install` (if composer is installed locally) or `composer install`
   (if composer is installed globally).
5. Run `php bigquery.php`. The following commands are available:

```sh
datasets  List BigQuery datasets for a project
import    Import data into a BigQuery table
query     Run a BigQuery query
schema    Create or delete a table schema in BigQuery
```

## The commands

### datasets

List the datasets for your BigQuery project.

```sh
$ php bigquery.php datasets
test_dataset1
test_dataset2
test_dataset3
```
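
If you want to see what a listing like this looks like in code, here is a minimal
sketch using the `google/cloud-bigquery` client library (this is not the sample's
own implementation; the project ID is a placeholder):

```php
<?php
// Minimal sketch: list datasets with the google/cloud BigQuery client.
// Assumes `composer require google/cloud-bigquery`; the project ID is a placeholder.
require __DIR__ . '/vendor/autoload.php';

use Google\Cloud\BigQuery\BigQueryClient;

$bigQuery = new BigQueryClient([
    'projectId' => 'your-project-id',
]);

foreach ($bigQuery->datasets() as $dataset) {
    echo $dataset->id() . PHP_EOL;
}
```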

### import

Import data into a BigQuery table. You can import from several sources; a
client-library sketch of the underlying load job follows this list.

1. Import from a local JSON or CSV file. Make sure your files are
   [formatted correctly](https://cloud.google.com/bigquery/loading-data#specifying_the_source_format)

   ```sh
   $ php bigquery.php import test_dataset.test_table /path/to/your_data.csv
   $ php bigquery.php import test_dataset.test_table /path/to/your_data.json
   ```
1. Import from [a JSON or CSV file in Google Cloud Storage](https://cloud.google.com/bigquery/docs/loading-data-cloud-storage)

   ```sh
   $ php bigquery.php import test_dataset.test_table gs://your-storage-bucket/your_data.csv
   $ php bigquery.php import test_dataset.test_table gs://your-storage-bucket/your_data.json
   ```
1. Import from a [Datastore Backup](https://cloud.google.com/bigquery/loading-data-cloud-datastore)

   ```sh
   $ php bigquery.php import test_dataset.test_table gs://your-storage-bucket/your_data.backup_info
   ```

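The file-based imports above boil down to BigQuery load jobs. Here is a rough sketch
of the equivalent calls with a recent `google/cloud-bigquery` release (not the
sample's own code; dataset, table, bucket, and file paths are placeholders, and
method names can differ between library versions):

```php
<?php
// Minimal sketch: run BigQuery load jobs for a local file and a Cloud Storage file.
// Dataset, table, bucket, and paths are placeholders, not values from this sample.
require __DIR__ . '/vendor/autoload.php';

use Google\Cloud\BigQuery\BigQueryClient;

$bigQuery = new BigQueryClient(['projectId' => 'your-project-id']);
$table = $bigQuery->dataset('test_dataset')->table('test_table');

// Local CSV file: stream the file contents into a load job.
$loadConfig = $table->load(fopen('/path/to/your_data.csv', 'r'))
    ->sourceFormat('CSV');
$table->runJob($loadConfig)->waitUntilComplete();

// Cloud Storage file: point the load job at a gs:// URI instead.
$gcsConfig = $table->loadFromStorage('gs://your-storage-bucket/your_data.csv')
    ->sourceFormat('CSV');
$table->runJob($gcsConfig)->waitUntilComplete();
```
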
You can also [stream data into BigQuery](https://cloud.google.com/bigquery/streaming-data-into-bigquery)
one record at a time. This approach enables querying data without the delay of running a load job:

```sh
$ php bigquery.php import test_dataset.test_table
Import data for project cloud-samples-tests-php? [y/n]: y
name (required): Brent Shaffer
title (required): PHP Developer
Data streamed into BigQuery successfully
```
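
A streaming insert is a single `tabledata.insertAll` call. Sketched with the client
library (again not the sample's exact code; dataset, table, and row values are
placeholders):

```php
<?php
// Minimal sketch: stream one record into a table (tabledata.insertAll).
// Dataset, table, and row values are placeholders.
require __DIR__ . '/vendor/autoload.php';

use Google\Cloud\BigQuery\BigQueryClient;

$bigQuery = new BigQueryClient(['projectId' => 'your-project-id']);
$table = $bigQuery->dataset('test_dataset')->table('test_table');

$response = $table->insertRow([
    'name'  => 'Brent Shaffer',
    'title' => 'PHP Developer',
]);

if ($response->isSuccessful()) {
    echo 'Data streamed into BigQuery successfully' . PHP_EOL;
} else {
    foreach ($response->failedRows() as $row) {
        print_r($row['errors']); // each failed row reports why it was rejected
    }
}
```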

### query

Run a BigQuery query.

```sh
$ php bigquery.php query "SELECT TOP(corpus, 3) as title, COUNT(*) as unique_words FROM [publicdata:samples.shakespeare]"
--- Row 1 ---
title: hamlet
unique_words: 5318
--- Row 2 ---
title: kinghenryv
unique_words: 5104
--- Row 3 ---
title: cymbeline
unique_words: 4875
```
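
For reference, roughly the same query issued through the client library might look
like this (a sketch, not the sample's implementation; it keeps the legacy-SQL table
syntax from the example above and uses a placeholder project ID):

```php
<?php
// Minimal sketch: run a legacy-SQL query and print each row.
require __DIR__ . '/vendor/autoload.php';

use Google\Cloud\BigQuery\BigQueryClient;

$bigQuery = new BigQueryClient(['projectId' => 'your-project-id']);

$queryConfig = $bigQuery->query(
    'SELECT TOP(corpus, 3) as title, COUNT(*) as unique_words'
    . ' FROM [publicdata:samples.shakespeare]'
)->useLegacySql(true);

$results = $bigQuery->runQuery($queryConfig);

$i = 0;
foreach ($results as $row) {
    printf("--- Row %d ---\n", ++$i);
    printf("title: %s\nunique_words: %d\n", $row['title'], $row['unique_words']);
}
```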

### schema

Create a table schema in BigQuery. If a schema file is not supplied, you can
create a schema interactively.

```sh
$ php bigquery.php schema my_dataset.my_table --project your-project-id
Using project your-project-id
1st column name: name
1st column type (default: string):
  [0] string
  [1] bytes
  [2] integer
  [3] float
  [4] boolean
  [5] timestamp
  [6] date
  [7] record
 > 0
1st column mode (default: nullable):
  [0] nullable
  [1] required
  [2] repeated
 > 1
add another field? [y/n]: n
[
    {
        "name": "name",
        "type": "string",
        "mode": "required"
    }
]
Does this schema look correct? [y/n]: y
Table created successfully
```
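
Creating the table itself is a single `tables.insert` call with a schema. Here is a
client-library sketch using the field defined in the session above (dataset, table,
and project ID are placeholders):

```php
<?php
// Minimal sketch: create a table with an explicit schema.
// Dataset, table, and project ID are placeholders.
require __DIR__ . '/vendor/autoload.php';

use Google\Cloud\BigQuery\BigQueryClient;

$bigQuery = new BigQueryClient(['projectId' => 'your-project-id']);
$dataset = $bigQuery->dataset('my_dataset');

$schema = [
    'fields' => [
        ['name' => 'name', 'type' => 'string', 'mode' => 'required'],
    ],
];

$dataset->createTable('my_table', ['schema' => $schema]);
echo 'Table created successfully' . PHP_EOL;
```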

The schema command also allows the deletion of tables:

```sh
$ php bigquery.php schema my_dataset.my_table --project your-project-id --delete
Using project your-project-id
Are you sure you want to delete the BigQuery table "my_table"? [y/n]: y
Table deleted successfully
```
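
The deletion is likewise a one-line client-library call (a sketch with placeholder
names):

```php
<?php
// Minimal sketch: delete a table. Dataset and table names are placeholders.
require __DIR__ . '/vendor/autoload.php';

use Google\Cloud\BigQuery\BigQueryClient;

$bigQuery = new BigQueryClient(['projectId' => 'your-project-id']);
$bigQuery->dataset('my_dataset')->table('my_table')->delete();
echo 'Table deleted successfully' . PHP_EOL;
```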

## Contributing changes
## Licensing

* See [LICENSE](../../LICENSE)