Commit 4f32535

derekjchow authored and nealwu committed

Update docs in object_detection to reflect new path. (tensorflow#2434)
1 parent fa32bb5 commit 4f32535

8 files changed (+38, -37 lines)

research/object_detection/g3doc/defining_your_own_model.md

Lines changed: 1 addition & 1 deletion
@@ -94,7 +94,7 @@ definition as one example. Some remarks:
 
 * We typically initialize the weights of this feature extractor
   using those from the
-  [Slim Resnet-101 classification checkpoint](https://github.com/tensorflow/models/tree/master/slim#pre-trained-models),
+  [Slim Resnet-101 classification checkpoint](https://github.com/tensorflow/models/tree/master/research/slim#pre-trained-models),
   and we know
   that images were preprocessed when training this checkpoint
   by subtracting a channel mean from each input
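For reference, the checkpoint this remark points to is distributed through the Slim pre-trained models page linked in the hunk; a minimal sketch of fetching it, assuming the tarball name that page used at the time:

```bash
# Assumed tarball name from the Slim pre-trained models page; verify against
# the link above before relying on it.
wget http://download.tensorflow.org/models/resnet_v1_101_2016_08_28.tar.gz
tar -xvf resnet_v1_101_2016_08_28.tar.gz   # unpacks resnet_v1_101.ckpt
```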

research/object_detection/g3doc/exporting_models.md

Lines changed: 2 additions & 2 deletions
@@ -8,10 +8,10 @@ graph proto. A checkpoint will typically consist of three files:
 * model.ckpt-${CHECKPOINT_NUMBER}.meta
 
 After you've identified a candidate checkpoint to export, run the following
-command from tensorflow/models/object_detection:
+command from tensorflow/models/research/object_detection:
 
 ``` bash
-# From tensorflow/models
+# From tensorflow/models/research/
 python object_detection/export_inference_graph.py \
     --input_type image_tensor \
     --pipeline_config_path ${PIPELINE_CONFIG_PATH} \
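The hunk ends before the command's remaining flags, so they are not reproduced here. The two placeholders it does show are plain shell variables set by the reader beforehand; a sketch with hypothetical values:

```bash
# Hypothetical values; substitute your own pipeline config and checkpoint number.
export PIPELINE_CONFIG_PATH=object_detection/samples/configs/faster_rcnn_resnet101_pets.config
export CHECKPOINT_NUMBER=200000
# The candidate checkpoint is the trio of files sharing this prefix:
ls model.ckpt-${CHECKPOINT_NUMBER}.*
```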

research/object_detection/g3doc/installation.md

Lines changed: 7 additions & 7 deletions
@@ -7,7 +7,7 @@ Tensorflow Object Detection API depends on the following libraries:
 * Protobuf 2.6
 * Pillow 1.0
 * lxml
-* tf Slim (which is included in the "tensorflow/models" checkout)
+* tf Slim (which is included in the "tensorflow/models/research/" checkout)
 * Jupyter notebook
 * Matplotlib
 * Tensorflow
@@ -45,23 +45,23 @@ sudo pip install matplotlib
 The Tensorflow Object Detection API uses Protobufs to configure model and
 training parameters. Before the framework can be used, the Protobuf libraries
 must be compiled. This should be done by running the following command from
-the tensorflow/models directory:
+the tensorflow/models/research/ directory:
 
 
 ``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
 protoc object_detection/protos/*.proto --python_out=.
 ```
 
 ## Add Libraries to PYTHONPATH
 
-When running locally, the tensorflow/models/ and slim directories should be
-appended to PYTHONPATH. This can be done by running the following from
-tensorflow/models/:
+When running locally, the tensorflow/models/research/ and slim directories
+should be appended to PYTHONPATH. This can be done by running the following from
+tensorflow/models/research/:
 
 
 ``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
 export PYTHONPATH=$PYTHONPATH:`pwd`:`pwd`/slim
 ```
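Taken together, the updated instructions amount to the sequence below; the final line is an assumed smoke test (`model_builder_test.py` ships with the API but is not part of this diff):

```bash
# From tensorflow/models/research/
protoc object_detection/protos/*.proto --python_out=.   # compile the protos
export PYTHONPATH=$PYTHONPATH:`pwd`:`pwd`/slim          # expose research/ and slim/
# Assumed test script; it only passes if the two steps above took effect.
python object_detection/builders/model_builder_test.py
```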

research/object_detection/g3doc/preparing_inputs.md

Lines changed: 4 additions & 4 deletions
@@ -13,7 +13,7 @@ To download, extract and convert it to TFRecords, run the following commands
 below:
 
 ```bash
-# From tensorflow/models
+# From tensorflow/models/research/
 wget http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar
 tar -xvf VOCtrainval_11-May-2012.tar
 python object_detection/create_pascal_tf_record.py \
@@ -27,7 +27,7 @@ python object_detection/create_pascal_tf_record.py \
 ```
 
 You should end up with two TFRecord files named `pascal_train.record` and
-`pascal_val.record` in the `tensorflow/models` directory.
+`pascal_val.record` in the `tensorflow/models/research/` directory.
 
 The label map for the PASCAL VOC data set can be found at
 `object_detection/data/pascal_label_map.pbtxt`.
@@ -39,7 +39,7 @@ The Oxford-IIIT Pet data set is located
 convert it to TFRecords, run the following commands below:
 
 ```bash
-# From tensorflow/models
+# From tensorflow/models/research/
 wget http://www.robots.ox.ac.uk/~vgg/data/pets/data/images.tar.gz
 wget http://www.robots.ox.ac.uk/~vgg/data/pets/data/annotations.tar.gz
 tar -xvf annotations.tar.gz
@@ -51,7 +51,7 @@ python object_detection/create_pet_tf_record.py \
 ```
 
 You should end up with two TFRecord files named `pet_train.record` and
-`pet_val.record` in the `tensorflow/models` directory.
+`pet_val.record` in the `tensorflow/models/research/` directory.
 
 The label map for the Pet dataset can be found at
 `object_detection/data/pet_label_map.pbtxt`.
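A quick sanity check after either conversion, assuming the commands were run from `tensorflow/models/research/` as directed (the file names are the ones the doc says to expect):

```bash
# From tensorflow/models/research/
# Non-empty files indicate the conversion scripts wrote their TFRecords.
ls -lh pascal_train.record pascal_val.record   # PASCAL VOC
ls -lh pet_train.record pet_val.record         # Oxford-IIIT Pets
```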

research/object_detection/g3doc/running_locally.md

Lines changed: 2 additions & 2 deletions
@@ -33,7 +33,7 @@ Oxford-IIIT Pet dataset.
 A local training job can be run with the following command:
 
 ```bash
-# From the tensorflow/models/ directory
+# From the tensorflow/models/research/ directory
 python object_detection/train.py \
     --logtostderr \
     --pipeline_config_path=${PATH_TO_YOUR_PIPELINE_CONFIG} \
@@ -52,7 +52,7 @@ train directory for new checkpoints and evaluate them on a test dataset. The
 job can be run using the following command:
 
 ```bash
-# From the tensorflow/models/ directory
+# From the tensorflow/models/research/ directory
 python object_detection/eval.py \
     --logtostderr \
     --pipeline_config_path=${PATH_TO_YOUR_PIPELINE_CONFIG} \
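Progress for the train/eval pair is typically watched with TensorBoard; a minimal sketch, where `${PATH_TO_TRAIN_DIR}` is an illustrative placeholder (not a variable from this diff) for the directory the training job writes checkpoints to:

```bash
# ${PATH_TO_TRAIN_DIR} is a hypothetical placeholder for the train directory.
tensorboard --logdir=${PATH_TO_TRAIN_DIR}
```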

research/object_detection/g3doc/running_notebook.md

Lines changed: 2 additions & 2 deletions
@@ -3,10 +3,10 @@
 If you'd like to hit the ground running and run detection on a few example
 images right out of the box, we recommend trying out the Jupyter notebook demo.
 To run the Jupyter notebook, run the following command from
-`tensorflow/models/object_detection`:
+`tensorflow/models/research/object_detection`:
 
 ```
-# From tensorflow/models/object_detection
+# From tensorflow/models/research/object_detection
 jupyter notebook
 ```

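Since the notebook imports the `object_detection` package, the PYTHONPATH setup from installation.md must be in effect in the shell that launches Jupyter; a combined sketch using only commands that appear elsewhere in this commit:

```bash
# From tensorflow/models/research/
export PYTHONPATH=$PYTHONPATH:`pwd`:`pwd`/slim
cd object_detection && jupyter notebook
```
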
research/object_detection/g3doc/running_on_cloud.md

Lines changed: 2 additions & 2 deletions
@@ -27,7 +27,7 @@ packaged (along with its TF-Slim dependency). The required packages can be
 created with the following command
 
 ``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
 python setup.py sdist
 (cd slim && python setup.py sdist)
 ```
@@ -69,7 +69,7 @@ been written, a user can start a training job on Cloud ML Engine using the
 following command:
 
 ``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
 gcloud ml-engine jobs submit training object_detection_`date +%s` \
     --job-dir=gs://${TRAIN_DIR} \
     --packages dist/object_detection-0.1.tar.gz,slim/dist/slim-0.1.tar.gz \
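Before submitting the job, it is worth confirming that the two sdist commands produced exactly the archives named in the `--packages` flag; a minimal check:

```bash
# From tensorflow/models/research/
ls dist/object_detection-0.1.tar.gz slim/dist/slim-0.1.tar.gz
```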

research/object_detection/g3doc/running_pets.md

Lines changed: 18 additions & 17 deletions
@@ -51,18 +51,19 @@ dataset for Oxford-IIIT Pets lives
 [here](http://www.robots.ox.ac.uk/~vgg/data/pets/). You will need to download
 both the image dataset [`images.tar.gz`](http://www.robots.ox.ac.uk/~vgg/data/pets/data/images.tar.gz)
 and the groundtruth data [`annotations.tar.gz`](http://www.robots.ox.ac.uk/~vgg/data/pets/data/annotations.tar.gz)
-to the `tensorflow/models` directory and unzip them. This may take some time.
+to the `tensorflow/models/research/` directory and unzip them. This may take
+some time.
 
 ``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
 wget http://www.robots.ox.ac.uk/~vgg/data/pets/data/images.tar.gz
 wget http://www.robots.ox.ac.uk/~vgg/data/pets/data/annotations.tar.gz
 tar -xvf images.tar.gz
 tar -xvf annotations.tar.gz
 ```
 
-After downloading the tarballs, your `tensorflow/models` directory should appear
-as follows:
+After downloading the tarballs, your `tensorflow/models/research/` directory
+should appear as follows:
 
 ```lang-none
 - images.tar.gz
@@ -76,10 +77,10 @@ as follows:
 The Tensorflow Object Detection API expects data to be in the TFRecord format,
 so we'll now run the `create_pet_tf_record` script to convert from the raw
 Oxford-IIIT Pet dataset into TFRecords. Run the following commands from the
-`tensorflow/models` directory:
+`tensorflow/models/research/` directory:
 
 ``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
 python object_detection/create_pet_tf_record.py \
     --label_map_path=object_detection/data/pet_label_map.pbtxt \
     --data_dir=`pwd` \
@@ -90,14 +91,14 @@ Note: It is normal to see some warnings when running this script. You may ignore
 them.
 
 Two TFRecord files named `pet_train.record` and `pet_val.record` should be
-generated in the `tensorflow/models` directory.
+generated in the `tensorflow/models/research/` directory.
 
 Now that the data has been generated, we'll need to upload it to Google Cloud
 Storage so the data can be accessed by ML Engine. Run the following command to
 copy the files into your GCS bucket (substituting `${YOUR_GCS_BUCKET}`):
 
 ``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
 gsutil cp pet_train.record gs://${YOUR_GCS_BUCKET}/data/pet_train.record
 gsutil cp pet_val.record gs://${YOUR_GCS_BUCKET}/data/pet_val.record
 gsutil cp object_detection/data/pet_label_map.pbtxt gs://${YOUR_GCS_BUCKET}/data/pet_label_map.pbtxt
@@ -145,7 +146,7 @@ upload your edited file onto GCS, making note of the path it was uploaded to
 (we'll need it when starting the training/eval jobs).
 
 ``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
 
 # Edit the faster_rcnn_resnet101_pets.config template. Please note that there
 # are multiple places where PATH_TO_BE_CONFIGURED needs to be set.
@@ -187,10 +188,10 @@ Before we can start a job on Google Cloud ML Engine, we must:
 2. Write a cluster configuration for our Google Cloud ML job.
 
 To package the Tensorflow Object Detection code, run the following commands from
-the `tensorflow/models/` directory:
+the `tensorflow/models/research/` directory:
 
 ``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
 python setup.py sdist
 (cd slim && python setup.py sdist)
 ```
@@ -202,11 +203,11 @@ For running the training Cloud ML job, we'll configure the cluster to use 10
 training jobs (1 master + 9 workers) and three parameter servers. The
 configuration file can be found at `object_detection/samples/cloud/cloud.yml`.
 
-To start training, execute the following command from the `tensorflow/models/`
-directory:
+To start training, execute the following command from the
+`tensorflow/models/research/` directory:
 
 ``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
 gcloud ml-engine jobs submit training `whoami`_object_detection_`date +%s` \
     --job-dir=gs://${YOUR_GCS_BUCKET}/train \
     --packages dist/object_detection-0.1.tar.gz,slim/dist/slim-0.1.tar.gz \
@@ -221,7 +222,7 @@ gcloud ml-engine jobs submit training `whoami`_object_detection_`date +%s` \
 Once training has started, we can run an evaluation concurrently:
 
 ``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
 gcloud ml-engine jobs submit training `whoami`_object_detection_eval_`date +%s` \
     --job-dir=gs://${YOUR_GCS_BUCKET}/train \
     --packages dist/object_detection-0.1.tar.gz,slim/dist/slim-0.1.tar.gz \
@@ -288,10 +289,10 @@ three files:
 * `model.ckpt-${CHECKPOINT_NUMBER}.meta`
 
 After you've identified a candidate checkpoint to export, run the following
-command from `tensorflow/models`:
+command from `tensorflow/models/research/`:
 
 ``` bash
-# From tensorflow/models
+# From tensorflow/models/research/
 gsutil cp gs://${YOUR_GCS_BUCKET}/train/model.ckpt-${CHECKPOINT_NUMBER}.* .
 python object_detection/export_inference_graph.py \
     --input_type image_tensor \
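The `${CHECKPOINT_NUMBER}` used in the export step is read off whatever checkpoints the Cloud ML job has saved so far; a sketch of listing the candidates, assuming the `gs://${YOUR_GCS_BUCKET}/train` layout used throughout the doc:

```bash
# List saved checkpoints; pick a ${CHECKPOINT_NUMBER} from the numeric suffixes.
gsutil ls gs://${YOUR_GCS_BUCKET}/train/model.ckpt-*
```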
