diff --git a/language/asqa/README.md b/language/asqa/README.md index 5563101a..2f1a51bc 100644 --- a/language/asqa/README.md +++ b/language/asqa/README.md @@ -18,7 +18,7 @@ To download the ASQA dataset, run: ``` mkdir dataset -gsutil cp -R gs://gresearch/ASQA/data/ASQA.json dataset +gcloud storage cp --recursive gs://gresearch/ASQA/data/ASQA.json dataset ``` Note: this requires [gsutil](https://cloud.google.com/storage/docs/gsutil). @@ -71,7 +71,7 @@ First, save your system output into key-value pairs in json format. Refer to the ``` mkdir outputs -gsutil cp -R gs://gresearch/ASQA/outputs/t5_predictions.json outputs +gcloud storage cp --recursive gs://gresearch/ASQA/outputs/t5_predictions.json outputs RESULTS_PATH=./outputs/t5_predictions.json EXP_NAME=t5 diff --git a/language/asqa/install.sh b/language/asqa/install.sh index ee6f6434..91bb50e0 100644 --- a/language/asqa/install.sh +++ b/language/asqa/install.sh @@ -26,4 +26,4 @@ pip install . # Download Roberta checkpoint. cd ../ mkdir roberta -gsutil cp -R gs://gresearch/ASQA/ckpts/roberta-squad roberta/ +gcloud storage cp --recursive gs://gresearch/ASQA/ckpts/roberta-squad roberta/ diff --git a/language/capwap/download.sh b/language/capwap/download.sh index 9d3fabf3..0c238e25 100644 --- a/language/capwap/download.sh +++ b/language/capwap/download.sh @@ -38,7 +38,7 @@ mkdir -p "${ROOT}" # gsutil download function gsutil_download() { - gsutil cp -r ${1} . + gcloud storage cp --recursive ${1} . 
} GET="gsutil_download" diff --git a/language/frost/README.md b/language/frost/README.md index c5ea7e6c..28e954a3 100644 --- a/language/frost/README.md +++ b/language/frost/README.md @@ -69,7 +69,7 @@ Alternatively in terminal, follow the instructions and install ``` mkdir frost_composition_sampling -gsutil cp -r gs://gresearch/frost_composition_sampling frost_composition_sampling/ +gcloud storage cp --recursive gs://gresearch/frost_composition_sampling frost_composition_sampling/ ``` ### FROST Pretrained Checkpoint @@ -161,4 +161,3 @@ The script generated shareded TFRecords annotated with FROST-style plans. Please see [here](https://github.com/google-research/pegasus#add-new-finetuning-dataset) to use them to finetune FROST models. - diff --git a/language/fruit/README.md b/language/fruit/README.md index fb02b4f0..2291bcb4 100644 --- a/language/fruit/README.md +++ b/language/fruit/README.md @@ -17,7 +17,7 @@ To download the FRUIT-Wiki dataset, run: ``` mkdir fruit_dataset -gsutil cp -R gs://gresearch/FRUIT/dataset fruit_dataset +gcloud storage cp --recursive gs://gresearch/FRUIT/dataset fruit_dataset ``` Note: this requires [gsutil](https://cloud.google.com/storage/docs/gsutil). diff --git a/language/multivec/README.md b/language/multivec/README.md index 3befc88b..5b099a51 100644 --- a/language/multivec/README.md +++ b/language/multivec/README.md @@ -153,7 +153,7 @@ python -m language.multivec.utils.data_processor \ ``` ### STEP-5: Encode passages & queries -Works with CPU, GPU and TPU. If use TPU inference, we recommend to use v3-8. Also upload passage and query tfr files to gs directory (use gsutil cp command) since TPU can't read from local files. +Works with CPU, GPU, and TPU. If using TPU inference, we recommend a v3-8. Also upload the passage and query tfr files to a GCS directory (using the gcloud storage cp command), since TPUs cannot read from local files.
```bash INFERENCE_OUTPUT_DIR= diff --git a/language/orqa/README.md b/language/orqa/README.md index e0b51f9b..c12fd5fd 100644 --- a/language/orqa/README.md +++ b/language/orqa/README.md @@ -47,7 +47,7 @@ download the data from the Natural Questions cloud bucket: ```bash mkdir original_nq -gsutil -m cp -R gs://natural_questions/v1.0 original_nq +gcloud storage cp --recursive gs://natural_questions/v1.0 original_nq cd original_nq export ORIG_NQ_PATH=$(pwd) ``` diff --git a/language/question_answering/bert_joint/README.md b/language/question_answering/bert_joint/README.md index 12361fe5..2e450157 100644 --- a/language/question_answering/bert_joint/README.md +++ b/language/question_answering/bert_joint/README.md @@ -16,7 +16,7 @@ pip install bert-tensorflow natural-questions You should then download our model and preprocessed training set with: ``` -gsutil cp -R gs://bert-nq/bert-joint-baseline . +gcloud storage cp --recursive gs://bert-nq/bert-joint-baseline . ``` This should give you the preprocessed training set, the model config, @@ -56,7 +56,7 @@ We suggest evaluating our pretrained model on the "tiny" dev set to verify that everything is working correctly. You can download the tiny NQ dev set with: ``` -gsutil cp -R gs://bert-nq/tiny-dev . +gcloud storage cp --recursive gs://bert-nq/tiny-dev . 
``` You can then evaluate our model on this data with: diff --git a/language/question_answering/decatt_docreader/README.md b/language/question_answering/decatt_docreader/README.md index 57018ea6..9170c397 100644 --- a/language/question_answering/decatt_docreader/README.md +++ b/language/question_answering/decatt_docreader/README.md @@ -9,7 +9,7 @@ mkdir -p $DATA_DIR **Download the Natural Questions data:** ``` -gsutil -m cp -r gs://natural_questions $DATA_DIR +gcloud storage cp --recursive gs://natural_questions $DATA_DIR export NQ_DATA_DIR=$DATA_DIR/natural_questions/v1.0 ``` diff --git a/language/search_agents/README.md b/language/search_agents/README.md index 7b277b77..514f76a4 100644 --- a/language/search_agents/README.md +++ b/language/search_agents/README.md @@ -67,24 +67,24 @@ To download the MuZero checkpoints, execute: ``` mkdir -p saved_model/muzero/initial_inference/variables -gsutil cp gs://search_agents/muzero/saved_model/initial_inference/saved_model.pb saved_model/muzero/initial_inference/. -gsutil cp gs://search_agents/muzero/saved_model/initial_inference/variables/variables.data-00000-of-00001 saved_model/muzero/initial_inference/variables/. -gsutil cp gs://search_agents/muzero/saved_model/initial_inference/variables/variables.index saved_model/muzero/initial_inference/variables/. +gcloud storage cp gs://search_agents/muzero/saved_model/initial_inference/saved_model.pb saved_model/muzero/initial_inference/. +gcloud storage cp gs://search_agents/muzero/saved_model/initial_inference/variables/variables.data-00000-of-00001 saved_model/muzero/initial_inference/variables/. +gcloud storage cp gs://search_agents/muzero/saved_model/initial_inference/variables/variables.index saved_model/muzero/initial_inference/variables/. mkdir -p saved_model/muzero/recurrent_inference/variables -gsutil cp gs://search_agents/muzero/saved_model/recurrent_inference/saved_model.pb saved_model/muzero/recurrent_inference/. 
-gsutil cp gs://search_agents/muzero/saved_model/recurrent_inference/variables/variables.data-00000-of-00001 saved_model/muzero/recurrent_inference/variables/. -gsutil cp gs://search_agents/muzero/saved_model/recurrent_inference/variables/variables.index saved_model/muzero/recurrent_inference/variables/. +gcloud storage cp gs://search_agents/muzero/saved_model/recurrent_inference/saved_model.pb saved_model/muzero/recurrent_inference/. +gcloud storage cp gs://search_agents/muzero/saved_model/recurrent_inference/variables/variables.data-00000-of-00001 saved_model/muzero/recurrent_inference/variables/. +gcloud storage cp gs://search_agents/muzero/saved_model/recurrent_inference/variables/variables.index saved_model/muzero/recurrent_inference/variables/. ``` To download the T5 checkpoints, execute: ``` mkdir -p saved_model/t5/variables -gsutil cp gs://search_agents/t5/saved_model/saved_model.pb saved_model/t5/. -gsutil cp gs://search_agents/t5/saved_model/variables/variables.data-00000-of-00002 saved_model/t5/variables/. -gsutil cp gs://search_agents/t5/saved_model/variables/variables.data-00001-of-00002 saved_model/t5/variables/. -gsutil cp gs://search_agents/t5/saved_model/variables/variables.index saved_model/t5/variables/. +gcloud storage cp gs://search_agents/t5/saved_model/saved_model.pb saved_model/t5/. +gcloud storage cp gs://search_agents/t5/saved_model/variables/variables.data-00000-of-00002 saved_model/t5/variables/. +gcloud storage cp gs://search_agents/t5/saved_model/variables/variables.data-00001-of-00002 saved_model/t5/variables/. +gcloud storage cp gs://search_agents/t5/saved_model/variables/variables.index saved_model/t5/variables/. ``` [Apache Beam](https://beam.apache.org/) is required to run the evaluation @@ -109,11 +109,11 @@ To download all other data dependencies for running the eval pipeline, execute: ``` mkdir data -gsutil cp gs://search_agents/sample_input.jsonl data/. -gsutil cp gs://search_agents/bert/bert_config.json . 
-gsutil cp gs://search_agents/bert/bert_model.ckpt.data-00000-of-00001 . -gsutil cp gs://search_agents/bert/bert_model.ckpt.index . -gsutil cp gs://search_agents/bert/vocab.txt . +gcloud storage cp gs://search_agents/sample_input.jsonl data/. +gcloud storage cp gs://search_agents/bert/bert_config.json . +gcloud storage cp gs://search_agents/bert/bert_model.ckpt.data-00000-of-00001 . +gcloud storage cp gs://search_agents/bert/bert_model.ckpt.index . +gcloud storage cp gs://search_agents/bert/vocab.txt . ``` The MuZero checkpoints need to be served through