4 changes: 2 additions & 2 deletions language/asqa/README.md
@@ -18,7 +18,7 @@ To download the ASQA dataset, run:

```
mkdir dataset
-gsutil cp -R gs://gresearch/ASQA/data/ASQA.json dataset
+gcloud storage cp --recursive gs://gresearch/ASQA/data/ASQA.json dataset
```

Note: this requires [gsutil](https://cloud.google.com/storage/docs/gsutil).
@@ -71,7 +71,7 @@ First, save your system output into key-value pairs in json format. Refer to the

```
mkdir outputs
-gsutil cp -R gs://gresearch/ASQA/outputs/t5_predictions.json outputs
+gcloud storage cp --recursive gs://gresearch/ASQA/outputs/t5_predictions.json outputs
RESULTS_PATH=./outputs/t5_predictions.json

EXP_NAME=t5
2 changes: 1 addition & 1 deletion language/asqa/install.sh
@@ -26,4 +26,4 @@ pip install .
# Download Roberta checkpoint.
cd ../
mkdir roberta
-gsutil cp -R gs://gresearch/ASQA/ckpts/roberta-squad roberta/
+gcloud storage cp --recursive gs://gresearch/ASQA/ckpts/roberta-squad roberta/
2 changes: 1 addition & 1 deletion language/capwap/download.sh
@@ -38,7 +38,7 @@ mkdir -p "${ROOT}"

# gsutil download
function gsutil_download() {
-gsutil cp -r ${1} .
+gcloud storage cp --recursive ${1} .
}
GET="gsutil_download"

3 changes: 1 addition & 2 deletions language/frost/README.md
@@ -69,7 +69,7 @@ Alternatively in terminal, follow the instructions and install

```
mkdir frost_composition_sampling
-gsutil cp -r gs://gresearch/frost_composition_sampling frost_composition_sampling/
+gcloud storage cp --recursive gs://gresearch/frost_composition_sampling frost_composition_sampling/
```

### FROST Pretrained Checkpoint
@@ -161,4 +161,3 @@ The script generates sharded TFRecords annotated with FROST-style plans. Please
see
[here](https://github.com/google-research/pegasus#add-new-finetuning-dataset) to
use them to finetune FROST models.

2 changes: 1 addition & 1 deletion language/fruit/README.md
@@ -17,7 +17,7 @@ To download the FRUIT-Wiki dataset, run:

```
mkdir fruit_dataset
-gsutil cp -R gs://gresearch/FRUIT/dataset fruit_dataset
+gcloud storage cp --recursive gs://gresearch/FRUIT/dataset fruit_dataset
```

Note: this requires [gsutil](https://cloud.google.com/storage/docs/gsutil).
2 changes: 1 addition & 1 deletion language/multivec/README.md
@@ -153,7 +153,7 @@ python -m language.multivec.utils.data_processor \
```

### STEP-5: Encode passages & queries
-Works with CPU, GPU, and TPU. If using TPU inference, we recommend v3-8. Also upload the passage and query tfr files to a gs directory (use the gsutil cp command), since the TPU can't read local files.
+Works with CPU, GPU, and TPU. If using TPU inference, we recommend v3-8. Also upload the passage and query tfr files to a gs directory (use the gcloud storage cp command), since the TPU can't read local files.

```bash
INFERENCE_OUTPUT_DIR=<LOCAL_DIR>
2 changes: 1 addition & 1 deletion language/orqa/README.md
@@ -47,7 +47,7 @@ download the data from the Natural Questions cloud bucket:

```bash
mkdir original_nq
-gsutil -m cp -R gs://natural_questions/v1.0 original_nq
+gcloud storage cp --recursive gs://natural_questions/v1.0 original_nq
cd original_nq
export ORIG_NQ_PATH=$(pwd)
```
4 changes: 2 additions & 2 deletions language/question_answering/bert_joint/README.md
@@ -16,7 +16,7 @@ pip install bert-tensorflow natural-questions
You should then download our model and preprocessed training set with:

```
-gsutil cp -R gs://bert-nq/bert-joint-baseline .
+gcloud storage cp --recursive gs://bert-nq/bert-joint-baseline .
```

This should give you the preprocessed training set, the model config,
@@ -56,7 +56,7 @@ We suggest evaluating our pretrained model on the "tiny" dev set to verify that
everything is working correctly. You can download the tiny NQ dev set with:

```
-gsutil cp -R gs://bert-nq/tiny-dev .
+gcloud storage cp --recursive gs://bert-nq/tiny-dev .
```

You can then evaluate our model on this data with:
2 changes: 1 addition & 1 deletion language/question_answering/decatt_docreader/README.md
@@ -9,7 +9,7 @@ mkdir -p $DATA_DIR
**Download the Natural Questions data:**

```
-gsutil -m cp -r gs://natural_questions $DATA_DIR
+gcloud storage cp --recursive gs://natural_questions $DATA_DIR
export NQ_DATA_DIR=$DATA_DIR/natural_questions/v1.0
```

30 changes: 15 additions & 15 deletions language/search_agents/README.md
@@ -67,24 +67,24 @@ To download the MuZero checkpoints, execute:

```
mkdir -p saved_model/muzero/initial_inference/variables
-gsutil cp gs://search_agents/muzero/saved_model/initial_inference/saved_model.pb saved_model/muzero/initial_inference/.
-gsutil cp gs://search_agents/muzero/saved_model/initial_inference/variables/variables.data-00000-of-00001 saved_model/muzero/initial_inference/variables/.
-gsutil cp gs://search_agents/muzero/saved_model/initial_inference/variables/variables.index saved_model/muzero/initial_inference/variables/.
+gcloud storage cp gs://search_agents/muzero/saved_model/initial_inference/saved_model.pb saved_model/muzero/initial_inference/.
+gcloud storage cp gs://search_agents/muzero/saved_model/initial_inference/variables/variables.data-00000-of-00001 saved_model/muzero/initial_inference/variables/.
+gcloud storage cp gs://search_agents/muzero/saved_model/initial_inference/variables/variables.index saved_model/muzero/initial_inference/variables/.

mkdir -p saved_model/muzero/recurrent_inference/variables
-gsutil cp gs://search_agents/muzero/saved_model/recurrent_inference/saved_model.pb saved_model/muzero/recurrent_inference/.
-gsutil cp gs://search_agents/muzero/saved_model/recurrent_inference/variables/variables.data-00000-of-00001 saved_model/muzero/recurrent_inference/variables/.
-gsutil cp gs://search_agents/muzero/saved_model/recurrent_inference/variables/variables.index saved_model/muzero/recurrent_inference/variables/.
+gcloud storage cp gs://search_agents/muzero/saved_model/recurrent_inference/saved_model.pb saved_model/muzero/recurrent_inference/.
+gcloud storage cp gs://search_agents/muzero/saved_model/recurrent_inference/variables/variables.data-00000-of-00001 saved_model/muzero/recurrent_inference/variables/.
+gcloud storage cp gs://search_agents/muzero/saved_model/recurrent_inference/variables/variables.index saved_model/muzero/recurrent_inference/variables/.
```

To download the T5 checkpoints, execute:

```
mkdir -p saved_model/t5/variables
-gsutil cp gs://search_agents/t5/saved_model/saved_model.pb saved_model/t5/.
-gsutil cp gs://search_agents/t5/saved_model/variables/variables.data-00000-of-00002 saved_model/t5/variables/.
-gsutil cp gs://search_agents/t5/saved_model/variables/variables.data-00001-of-00002 saved_model/t5/variables/.
-gsutil cp gs://search_agents/t5/saved_model/variables/variables.index saved_model/t5/variables/.
+gcloud storage cp gs://search_agents/t5/saved_model/saved_model.pb saved_model/t5/.
+gcloud storage cp gs://search_agents/t5/saved_model/variables/variables.data-00000-of-00002 saved_model/t5/variables/.
+gcloud storage cp gs://search_agents/t5/saved_model/variables/variables.data-00001-of-00002 saved_model/t5/variables/.
+gcloud storage cp gs://search_agents/t5/saved_model/variables/variables.index saved_model/t5/variables/.
```

[Apache Beam](https://beam.apache.org/) is required to run the evaluation
@@ -109,11 +109,11 @@ To download all other data dependencies for running the eval pipeline, execute:

```
mkdir data
-gsutil cp gs://search_agents/sample_input.jsonl data/.
-gsutil cp gs://search_agents/bert/bert_config.json .
-gsutil cp gs://search_agents/bert/bert_model.ckpt.data-00000-of-00001 .
-gsutil cp gs://search_agents/bert/bert_model.ckpt.index .
-gsutil cp gs://search_agents/bert/vocab.txt .
+gcloud storage cp gs://search_agents/sample_input.jsonl data/.
+gcloud storage cp gs://search_agents/bert/bert_config.json .
+gcloud storage cp gs://search_agents/bert/bert_model.ckpt.data-00000-of-00001 .
+gcloud storage cp gs://search_agents/bert/bert_model.ckpt.index .
+gcloud storage cp gs://search_agents/bert/vocab.txt .
```

The MuZero checkpoints need to be served through
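Every substitution in this PR follows the same mechanical pattern: `gsutil cp` becomes `gcloud storage cp`, the recursion flags `-r`/`-R` become `--recursive`, and the `-m` parallelism flag is dropped because `gcloud storage` parallelizes transfers by default. As a minimal sketch of that rewrite rule, here is a hypothetical helper (the function name and sed expressions are illustrative, not part of the PR):

```shell
# Rewrite a gsutil cp invocation into its gcloud storage equivalent.
# Assumes the simple command shapes seen in this PR:
#   gsutil [-m] cp [-r|-R] SRC DST
migrate_gsutil_cp() {
  echo "$1" | sed -E 's/^gsutil (-m )?cp/gcloud storage cp/; s/ -[Rr] / --recursive /'
}

migrate_gsutil_cp "gsutil -m cp -R gs://bucket/path dest"
# → gcloud storage cp --recursive gs://bucket/path dest
```

This is only a sketch for the command shapes above; a general migration should handle other gsutil flags case by case.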