Commit 39ac094

Update download urls (#292)
* Prepare for v2.6.0 release
* Update download URLs
* Revert TF v1.15.2 doc changes
* Fix BERT Large FP32 build

Each change Signed-off-by: Abolfazl Shahbazi <[email protected]>
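The bulk of this commit is a mechanical version bump (`v2_5_0` to `v2_6_0`) applied identically across 82 READMEs. A change of that shape can be scripted; the sketch below reproduces it on a temporary file with GNU sed. This is illustrative only, not the process actually used for the commit; the URL mirrors the diffs below.

```shell
# Illustrative only: bump the model-package version in a README the same
# way this commit does. GNU sed's -i (in-place edit) is assumed.
set -eu
workdir=$(mktemp -d)
readme="$workdir/README.md"
printf '%s\n' \
  'wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_5_0/resnet50-fp32-inference.tar.gz' \
  > "$readme"
# Rewrite only the versioned path segment, leaving the rest of the URL intact.
sed -i 's|/models/v2_5_0/|/models/v2_6_0/|g' "$readme"
cat "$readme"
```

Because the version lives in a single path segment, the same substitution is safe to run over every README in the tree.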
1 parent: ea107ea

File tree: 82 files changed (+120, -120 lines)


quickstart/image_recognition/tensorflow/densenet169/inference/cpu/fp32/README.md (2 additions, 2 deletions)

@@ -10,7 +10,7 @@ Intel-optimized TensorFlow.
 <!--- 20. Download link -->
 ## Download link
 
-[densenet169-fp32-inference.tar.gz](https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_5_0/densenet169-fp32-inference.tar.gz)
+[densenet169-fp32-inference.tar.gz](https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_6_0/densenet169-fp32-inference.tar.gz)
 
 <!--- 30. Datasets -->
 ## Datasets
@@ -47,7 +47,7 @@ Set environment variables for the path to your `DATASET_DIR` and an
 DATASET_DIR=<path to the dataset>
 OUTPUT_DIR=<directory where log files will be written>
 
-wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_5_0/densenet169-fp32-inference.tar.gz
+wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_6_0/densenet169-fp32-inference.tar.gz
 tar -xzf densenet169-fp32-inference.tar.gz
 cd densenet169-fp32-inference
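All of the URLs touched in this commit follow one pattern: a fixed storage bucket, a version directory, and a package tarball name. That is why the change is identical in every file. A small helper makes the pattern explicit; `model_url` is a hypothetical function made up for illustration, and only the base URL and naming scheme come from the diffs.

```shell
# The storage bucket and naming scheme are taken from the diffs in this
# commit; model_url itself is a hypothetical helper, not part of the repo.
base_url="https://storage.googleapis.com/intel-optimized-tensorflow/models"
model_url() {
  # usage: model_url <version> <package-name>
  printf '%s/%s/%s.tar.gz\n' "$base_url" "$1" "$2"
}
model_url v2_6_0 densenet169-fp32-inference
```

Bumping the release then means changing only the version argument, which is exactly the two-line-per-file diff seen throughout this commit.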

quickstart/image_recognition/tensorflow/inceptionv3/inference/cpu/fp32/README.md (2 additions, 2 deletions)

@@ -10,7 +10,7 @@ Intel-optimized TensorFlow.
 <!--- 20. Download link -->
 ## Download link
 
-[inceptionv3-fp32-inference.tar.gz](https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_5_0/inceptionv3-fp32-inference.tar.gz)
+[inceptionv3-fp32-inference.tar.gz](https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_6_0/inceptionv3-fp32-inference.tar.gz)
 
 <!--- 30. Datasets -->
 ## Datasets
@@ -47,7 +47,7 @@ Set environment variables for the path to your `DATASET_DIR` and an
 DATASET_DIR=<path to the dataset>
 OUTPUT_DIR=<directory where log files will be written>
 
-wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_5_0/inceptionv3-fp32-inference.tar.gz
+wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_6_0/inceptionv3-fp32-inference.tar.gz
 tar -xzf inceptionv3-fp32-inference.tar.gz
 cd inceptionv3-fp32-inference

quickstart/image_recognition/tensorflow/inceptionv3/inference/cpu/int8/README.md (2 additions, 2 deletions)

@@ -10,7 +10,7 @@ Intel-optimized TensorFlow.
 <!--- 20. Download link -->
 ## Download link
 
-[inceptionv3-int8-inference.tar.gz](https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_5_0/inceptionv3-int8-inference.tar.gz)
+[inceptionv3-int8-inference.tar.gz](https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_6_0/inceptionv3-int8-inference.tar.gz)
 
 <!--- 30. Datasets -->
 ## Datasets
@@ -47,7 +47,7 @@ Set environment variables for the path to your `DATASET_DIR` and an
 DATASET_DIR=<path to the dataset>
 OUTPUT_DIR=<directory where log files will be written>
 
-wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_5_0/inceptionv3-int8-inference.tar.gz
+wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_6_0/inceptionv3-int8-inference.tar.gz
 tar -xzf inceptionv3-int8-inference.tar.gz
 cd inceptionv3-int8-inference

quickstart/image_recognition/tensorflow/inceptionv4/inference/cpu/fp32/README.md (2 additions, 2 deletions)

@@ -10,7 +10,7 @@ Intel-optimized TensorFlow.
 <!--- 20. Download link -->
 ## Download link
 
-[inceptionv4-fp32-inference.tar.gz](https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_5_0/inceptionv4-fp32-inference.tar.gz)
+[inceptionv4-fp32-inference.tar.gz](https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_6_0/inceptionv4-fp32-inference.tar.gz)
 
 <!--- 30. Datasets -->
 ## Datasets
@@ -47,7 +47,7 @@ Set environment variables for the path to your `DATASET_DIR` and an
 DATASET_DIR=<path to the dataset>
 OUTPUT_DIR=<directory where log files will be written>
 
-wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_5_0/inceptionv4-fp32-inference.tar.gz
+wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_6_0/inceptionv4-fp32-inference.tar.gz
 tar -xzf inceptionv4-fp32-inference.tar.gz
 cd inceptionv4-fp32-inference

quickstart/image_recognition/tensorflow/inceptionv4/inference/cpu/int8/README.md (2 additions, 2 deletions)

@@ -10,7 +10,7 @@ Intel-optimized TensorFlow.
 <!--- 20. Download link -->
 ## Download link
 
-[inceptionv4-int8-inference.tar.gz](https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_5_0/inceptionv4-int8-inference.tar.gz)
+[inceptionv4-int8-inference.tar.gz](https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_6_0/inceptionv4-int8-inference.tar.gz)
 
 <!--- 30. Datasets -->
 ## Datasets
@@ -47,7 +47,7 @@ Set environment variables for the path to your `DATASET_DIR` and an
 DATASET_DIR=<path to the dataset>
 OUTPUT_DIR=<directory where log files will be written>
 
-wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_5_0/inceptionv4-int8-inference.tar.gz
+wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_6_0/inceptionv4-int8-inference.tar.gz
 tar -xzf inceptionv4-int8-inference.tar.gz
 cd inceptionv4-int8-inference

quickstart/image_recognition/tensorflow/mobilenet_v1/inference/cpu/fp32/README.md (2 additions, 2 deletions)

@@ -13,7 +13,7 @@ ImageNet dataset in the TF records format.
 <!--- 20. Download link -->
 ## Download link
 
-[mobilenet-v1-fp32-inference.tar.gz](https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_5_0/mobilenet-v1-fp32-inference.tar.gz)
+[mobilenet-v1-fp32-inference.tar.gz](https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_6_0/mobilenet-v1-fp32-inference.tar.gz)
 
 <!--- 30. Datasets -->
 ## Datasets
@@ -50,7 +50,7 @@ Download and untar the model package and then run a [quickstart script](#quick-s
 DATASET_DIR=<path to the preprocessed imagenet dataset>
 OUTPUT_DIR=<directory where log files will be written>
 
-wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_5_0/mobilenet-v1-fp32-inference.tar.gz
+wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_6_0/mobilenet-v1-fp32-inference.tar.gz
 tar -xzf mobilenet-v1-fp32-inference.tar.gz
 cd mobilenet-v1-fp32-inference

quickstart/image_recognition/tensorflow/mobilenet_v1/inference/cpu/int8/README.md (2 additions, 2 deletions)

@@ -10,7 +10,7 @@ Intel-optimized TensorFlow.
 <!--- 20. Download link -->
 ## Download link
 
-[mobilenet-v1-int8-inference.tar.gz](https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_5_0/mobilenet-v1-int8-inference.tar.gz)
+[mobilenet-v1-int8-inference.tar.gz](https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_6_0/mobilenet-v1-int8-inference.tar.gz)
 
 <!--- 30. Datasets -->
 ## Datasets
@@ -50,7 +50,7 @@ Set environment variables for the path to your `DATASET_DIR` and an
 DATASET_DIR=<path to the dataset> # This is only for running accuracy
 OUTPUT_DIR=<directory where log files will be written>
 
-wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_5_0/mobilenet-v1-int8-inference.tar.gz
+wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_6_0/mobilenet-v1-int8-inference.tar.gz
 tar -xzf mobilenet-v1-int8-inference.tar.gz
 cd mobilenet-v1-int8-inference

quickstart/image_recognition/tensorflow/resnet101/inference/cpu/fp32/README.md (2 additions, 2 deletions)

@@ -10,7 +10,7 @@ Intel-optimized TensorFlow.
 <!--- 20. Download link -->
 ## Download link
 
-[resnet101-fp32-inference.tar.gz](https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_5_0/resnet101-fp32-inference.tar.gz)
+[resnet101-fp32-inference.tar.gz](https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_6_0/resnet101-fp32-inference.tar.gz)
 
 <!--- 30. Datasets -->
 ## Datasets
@@ -47,7 +47,7 @@ Set environment variables for the path to your `DATASET_DIR` and an
 DATASET_DIR=<path to the dataset>
 OUTPUT_DIR=<directory where log files will be written>
 
-wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_5_0/resnet101-fp32-inference.tar.gz
+wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_6_0/resnet101-fp32-inference.tar.gz
 tar -xzf resnet101-fp32-inference.tar.gz
 cd resnet101-fp32-inference

quickstart/image_recognition/tensorflow/resnet101/inference/cpu/int8/README.md (2 additions, 2 deletions)

@@ -10,7 +10,7 @@ Intel-optimized TensorFlow.
 <!--- 20. Download link -->
 ## Download link
 
-[resnet101-int8-inference.tar.gz](https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_5_0/resnet101-int8-inference.tar.gz)
+[resnet101-int8-inference.tar.gz](https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_6_0/resnet101-int8-inference.tar.gz)
 
 <!--- 30. Datasets -->
 ## Datasets
@@ -47,7 +47,7 @@ Set environment variables for the path to your `DATASET_DIR` and an
 DATASET_DIR=<path to the dataset>
 OUTPUT_DIR=<directory where log files will be written>
 
-wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_5_0/resnet101-int8-inference.tar.gz
+wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_6_0/resnet101-int8-inference.tar.gz
 tar -xzf resnet101-int8-inference.tar.gz
 cd resnet101-int8-inference

quickstart/image_recognition/tensorflow/resnet50/inference/cpu/fp32/README.md (2 additions, 2 deletions)

@@ -9,7 +9,7 @@ Intel-optimized TensorFlow.
 <!--- 20. Download link -->
 ## Download link
 
-[resnet50-fp32-inference.tar.gz](https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_5_0/resnet50-fp32-inference.tar.gz)
+[resnet50-fp32-inference.tar.gz](https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_6_0/resnet50-fp32-inference.tar.gz)
 
 <!--- 30. Datasets -->
 ## Datasets
@@ -44,7 +44,7 @@ Download and untar the model package and then run a [quickstart script](#quick-s
 DATASET_DIR=<path to the preprocessed imagenet dataset>
 OUTPUT_DIR=<directory where log files will be written>
 
-wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_5_0/resnet50-fp32-inference.tar.gz
+wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_6_0/resnet50-fp32-inference.tar.gz
 tar -xzf resnet50-fp32-inference.tar.gz
 cd resnet50-fp32-inference
