Merge branch 'develop'
rsasch committed Jun 26, 2019
2 parents 98ba596 + a439100 commit a52c76e
Showing 212 changed files with 3,980 additions and 1,092 deletions.
96 changes: 74 additions & 22 deletions .travis.yml
@@ -2,7 +2,6 @@ sudo: required
dist: trusty
services:
- docker
- mysql
language: scala
scala:
- 2.12.6
@@ -22,27 +21,80 @@ before_cache:
env:
matrix:
# Setting this variable twice will cause the 'script' section to run twice with the respective env var invoked
- BUILD_TYPE=centaurAws
- BUILD_TYPE=centaurBcs
- BUILD_TYPE=centaurEngineUpgradeLocal
- BUILD_TYPE=centaurEngineUpgradePapiV2
- BUILD_TYPE=centaurHoricromtalPapiV2
- BUILD_TYPE=centaurHoricromtalEngineUpgradePapiV2
- BUILD_TYPE=centaurPapiUpgradePapiV1
- BUILD_TYPE=centaurPapiUpgradeNewWorkflowsPapiV1
- BUILD_TYPE=centaurLocal
- BUILD_TYPE=centaurPapiV1
- BUILD_TYPE=centaurPapiV2
- BUILD_TYPE=centaurSlurm
- BUILD_TYPE=centaurTes
- BUILD_TYPE=centaurWdlUpgradeLocal
- BUILD_TYPE=checkPublish
- BUILD_TYPE=conformanceLocal
- BUILD_TYPE=conformancePapiV2
- BUILD_TYPE=conformanceTesk
- BUILD_TYPE=dockerDeadlock
- BUILD_TYPE=dockerScripts
- BUILD_TYPE=sbt
- >-
BUILD_TYPE=centaurAws
BUILD_MYSQL=5.7
- >-
BUILD_TYPE=centaurBcs
BUILD_MYSQL=5.7
- >-
BUILD_TYPE=centaurEngineUpgradeLocal
BUILD_MYSQL=5.7
- >-
BUILD_TYPE=centaurEngineUpgradePapiV2
BUILD_MYSQL=5.7
- >-
BUILD_TYPE=centaurHoricromtalPapiV2
BUILD_MYSQL=5.7
- >-
BUILD_TYPE=centaurHoricromtalPapiV2
BUILD_MARIADB=10.3
- >-
BUILD_TYPE=centaurHoricromtalEngineUpgradePapiV2
BUILD_MYSQL=5.7
- >-
BUILD_TYPE=centaurHoricromtalEngineUpgradePapiV2
BUILD_MARIADB=10.3
- >-
BUILD_TYPE=centaurPapiUpgradePapiV1
BUILD_MYSQL=5.7
- >-
BUILD_TYPE=centaurPapiUpgradeNewWorkflowsPapiV1
BUILD_MYSQL=5.7
- >-
BUILD_TYPE=centaurLocal
BUILD_MYSQL=5.7
- >-
BUILD_TYPE=centaurLocal
BUILD_POSTGRESQL=11.3
- >-
BUILD_TYPE=centaurPapiV1
BUILD_MYSQL=5.7
- >-
BUILD_TYPE=centaurPapiV2
BUILD_MYSQL=5.7
- >-
BUILD_TYPE=centaurSlurm
BUILD_MYSQL=5.7
- >-
BUILD_TYPE=centaurTes
BUILD_MYSQL=5.7
- >-
BUILD_TYPE=centaurWdlUpgradeLocal
BUILD_MYSQL=5.7
- >-
BUILD_TYPE=checkPublish
BUILD_MYSQL=5.7
- >-
BUILD_TYPE=conformanceLocal
BUILD_MYSQL=5.7
- >-
BUILD_TYPE=conformancePapiV2
BUILD_MYSQL=5.7
- >-
BUILD_TYPE=conformanceTesk
BUILD_MYSQL=5.7
- >-
BUILD_TYPE=dockerDeadlock
BUILD_MYSQL=5.7
- >-
BUILD_TYPE=dockerScripts
BUILD_MYSQL=5.7
- >-
BUILD_TYPE=sbt
BUILD_MYSQL=5.7
BUILD_POSTGRESQL=11.3
BUILD_MARIADB=10.3
script:
- src/ci/bin/test.sh
notifications:
62 changes: 62 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,67 @@
# Cromwell Change Log

## 43 Release Notes

### Virtual Private Cloud with Subnetworks

Cromwell now allows PAPIv2 jobs to run on a specific subnetwork inside a private network by adding the subnetwork key
`subnetwork-label-key` inside the `virtual-private-cloud` stanza of the backend configuration. More info [here](https://cromwell.readthedocs.io/en/stable/backends/Google/).
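
As a sketch, a PAPIv2 backend stanza using the new key might look like the following. The label keys and auth name below are hypothetical examples; only `subnetwork-label-key` itself is named in this release, so check the linked documentation for the exact surrounding shape.

```hocon
# Hypothetical PAPIv2 backend config fragment. The label keys refer to
# labels on the Google project whose values name the network/subnetwork.
virtual-private-cloud {
  network-label-key = "my-network-label"        # example label key (assumption)
  subnetwork-label-key = "my-subnetwork-label"  # new in this release
  auth = "service-account"                      # an auth defined elsewhere in the config
}
```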

### Call caching database refactoring

Cromwell's `CALL_CACHING_HASH_ENTRY` primary key has been refactored to use a `BIGINT` datatype in place of the previous
`INT` datatype. Cromwell will not be usable while the Liquibase migration for this refactor is running.
In the Google Cloud SQL with SSD environment this migration runs at a rate of approximately 100,000 `CALL_CACHING_HASH_ENTRY`
rows per second; at that rate, a table of one billion rows would take roughly three hours. In deployments with millions or
billions of `CALL_CACHING_HASH_ENTRY` rows the migration may therefore require a significant amount of downtime, so please
plan accordingly. The following SQL can be used to estimate the number of rows in this table:

```sql
select max(CALL_CACHING_HASH_ENTRY_ID) from CALL_CACHING_HASH_ENTRY
```

### Stackdriver Instrumentation

Cromwell now supports sending metrics to [Google's Stackdriver API](https://cloud.google.com/monitoring/api/v3/).
Learn how to configure it [here](https://cromwell.readthedocs.io/en/stable/developers/Instrumentation/).
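
A minimal configuration sketch is shown below. The class and key names are assumptions based on the instrumentation documentation pattern, and the project ID is a placeholder; verify against the linked docs.

```hocon
services {
  Instrumentation {
    # Assumed class name for the Stackdriver implementation; check the docs.
    class = "cromwell.services.instrumentation.impl.stackdriver.StackdriverInstrumentationServiceActor"
    config {
      auth = "service-account"      # a google auth defined elsewhere in the config
      google-project = "my-project" # placeholder project ID
      flush-rate = 1 minute         # how often accumulated metrics are sent
    }
  }
}
```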

### BigQuery in PAPI

Cromwell now allows a user to specify BigQuery jobs when using the PAPIv2 backend.

### Configuration Changes

#### StatsD Instrumentation

The StatsD configuration path has changed: it was previously `services.Instrumentation.config.statsd`
and is now `services.Instrumentation.config`. More info on its configuration can be found
[here](https://cromwell.readthedocs.io/en/stable/developers/Instrumentation/).
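
Concretely, the change looks like the following sketch. The inner keys (`hostname`, `port`, `prefix`, `flush-rate`) are illustrative of a typical StatsD block; only the removal of the `statsd` level is stated by this release note.

```hocon
# Before (Cromwell 42 and earlier):
services.Instrumentation.config.statsd {
  hostname = "localhost"
  port = 8125
  flush-rate = 1 second
}

# After (this release) -- the intermediate "statsd" level is dropped:
services.Instrumentation.config {
  hostname = "localhost"
  port = 8125
  flush-rate = 1 second
}
```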

#### cached-copy

A new experimental feature, the `cached-copy` localization strategy, is available for the shared filesystem.
More information can be found in the [documentation on localization](https://cromwell.readthedocs.io/en/stable/backends/HPC).
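
As a sketch, the new strategy would be added to a shared-filesystem backend's localization list alongside the existing strategies. The surrounding structure below is illustrative; see the linked HPC documentation for the exact placement.

```hocon
# Illustrative shared-filesystem backend fragment; strategies are tried in order.
filesystems {
  local {
    localization: [
      "hard-link",
      "cached-copy",  # new experimental strategy in this release
      "copy"
    ]
  }
}
```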

#### Yaml node limits

YAML parsing now checks for cycles and limits the maximum number of parsed nodes to a configurable value. It also
limits the nesting depth of sequences and mappings. See [the documentation on configuring
YAML](https://cromwell.readthedocs.io/en/stable/Configuring/#yaml) for more information.
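
A configuration sketch for these limits is shown below. The key names and default values are assumptions; confirm them against the linked configuration documentation before relying on them.

```hocon
# Illustrative YAML parsing limits (key names and values are assumptions).
yaml {
  max-nodes = 1000000  # maximum number of parsed YAML nodes
  max-depth = 1000     # maximum nesting depth of sequences and mappings
}
```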

### API Changes

#### Workflow Metadata

* It is now possible to use `includeKey` and `excludeKey` at the same time. If both are supplied, a metadata key must match the `includeKey` **and not** match the `excludeKey` to be included.
* It is now possible to use "`calls`" as one of your `excludeKey`s, to request that only workflow metadata gets returned.
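
For example, a metadata request combining both parameters might be built as follows. The host, port, and workflow ID are placeholders; the endpoint shape follows Cromwell's workflow metadata REST API.

```shell
# Placeholders: adjust the host/port and workflow ID for your deployment.
HOST="http://localhost:8000"
WORKFLOW_ID="00000000-0000-0000-0000-000000000000"

# A key is returned only if it matches includeKey AND does not match excludeKey;
# excludeKey=calls requests workflow-level metadata only.
URL="${HOST}/api/workflows/v1/${WORKFLOW_ID}/metadata?includeKey=status&excludeKey=calls"
echo "${URL}"
# curl -s "${URL}"   # run against a live Cromwell server
```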

### PostgreSQL support

Cromwell now supports PostgreSQL (version 9.6 or higher, with the Large Object
extension installed) as a database backend.
See [here](https://cromwell.readthedocs.io/en/stable/Configuring/#database) for
instructions for configuring the database connection.
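
A connection sketch for PostgreSQL is shown below. The URL, user, and password are placeholders, and the key names are assumptions modeled on the Slick-based database block described in the linked configuration docs.

```hocon
# Illustrative PostgreSQL connection block (placeholder credentials).
# Requires PostgreSQL 9.6+ with the Large Object extension installed.
database {
  profile = "slick.jdbc.PostgresProfile$"
  db {
    driver = "org.postgresql.Driver"
    url = "jdbc:postgresql://localhost:5432/cromwell"
    user = "cromwell"       # placeholder
    password = "change-me"  # placeholder
    connectionTimeout = 5000
  }
}
```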

## 42 Release Notes

### Womtool endpoint
@@ -8,7 +8,7 @@ import scala.concurrent.Future
import scala.util.Try

trait ReadLikeFunctions extends PathFactory with IoFunctionSet with AsyncIoFunctions {

override def readFile(path: String, maxBytes: Option[Int], failOnOverflow: Boolean): Future[String] =
Future.fromTry(Try(buildPath(path))) flatMap { p => asyncIo.contentAsStringAsync(p, maxBytes, failOnOverflow) }

@@ -0,0 +1,55 @@
version 1.0

workflow cached_inputs {
Array[Int] one_to_ten = [1,2,3,4,5,6,7,8,9,10]

call ten_lines

scatter (x in one_to_ten) {
call read_line {
input:
file=ten_lines.text,
line_number=x
}
}
output {
Array[String] lines = read_line.line
}
}

task ten_lines {
command {
echo "Line 1
Line 2
Line 3
Line 4
Line 5
Line 6
Line 7
Line 8
Line 9
Line 10" > outfile.txt
}
output {
File text = "outfile.txt"
}
runtime {
docker: "ubuntu:latest"
}
}

task read_line {
input {
File file
Int line_number
}
command {
sed -n ~{line_number}p ~{file}
}
output {
String line = read_string(stdout())
}
runtime {
docker: "ubuntu:latest"
}
}
@@ -4,11 +4,14 @@ backends: [Papiv2-Virtual-Private-Cloud]

files {
workflow: virtual_private_cloud/check_network_in_vpc.wdl
options: virtual_private_cloud/wf_zone_options.json
}

metadata {
workflowName: check_network_in_vpc
status: Succeeded

"outputs.check_network_in_vpc.network_used": "cromwell-ci-vpc-network"
"outputs.check_network_in_vpc.subnetwork_used": "cromwell-ci-vpc-network"
"outputs.check_network_in_vpc.zone_used": "us-east1-c"
}
@@ -0,0 +1,14 @@
name: dollars_in_strings
testFormat: workflowsuccess

files {
workflow: dollars_in_strings/dollars_in_strings.wdl
}

metadata {
workflowName: read_dollared_strings
status: Succeeded
"outputs.read_dollared_strings.s1": "${BLAH}"
"outputs.read_dollared_strings.s2": "${BLAH}"
"outputs.read_dollared_strings.s3": "oops ${BLAH}"
}
@@ -0,0 +1,32 @@
workflow read_dollared_strings {

call dollars_in_strings

String dollar = "$"

output {
String s1 = "${dollar}{BLAH}"
String s2 = s1

String s3 = dollars_in_strings.s3
}
}


task dollars_in_strings {
String dollar = "$"
command <<<
cat > foo.txt << 'EOF'
oops ${dollar}{BLAH}
EOF
>>>
output {
File x = "foo.txt"
String s3 = read_string(x)
}
runtime {
docker: "ubuntu:latest"
}
}
@@ -0,0 +1,11 @@
version 1.0

workflow wf_level_file_size {
File input1 = "dos://wb-mock-drs-dev.storage.googleapis.com/4a3908ad-1f0b-4e2a-8a92-611f2123e8b0"
File input2 = "dos://wb-mock-drs-dev.storage.googleapis.com/0c8e7bc6-fd76-459d-947b-808b0605beb3"

output {
Float fileSize1 = size(input1)
Float fileSize2 = size(input2)
}
}
@@ -0,0 +1,17 @@
name: drs_wf_level_read_size
testFormat: workflowsuccess
backends: [Papiv2NoDockerHubConfig]

files {
workflow: drs_tests/wf_level_file_size.wdl
}

metadata {
workflowName: wf_level_file_size
status: Succeeded

"outputs.wf_level_file_size.fileSize1": 43.0
"outputs.wf_level_file_size.fileSize2": 45.0
}


@@ -10,6 +10,6 @@ submit {
statusCode: 400
message: """{
"status": "fail",
"message": "Error(s): Input file is not a valid yaml or json. Inputs data: ''. Error: MatchError: null."
"message": "Error(s): Input file is not a valid yaml or json. Inputs data: ''. Error: ParsingFailure: null."
}"""
}
@@ -33,6 +33,7 @@ task get_machine_info {

runtime {
docker: "nvidia/cuda:9.0-cudnn7-devel-ubuntu16.04"
bootDiskSizeGb: 20
gpuType: "nvidia-tesla-k80"
gpuCount: 1
nvidiaDriverVersion: driver_version
@@ -2,12 +2,17 @@ version 1.0

task get_network {
command {
set -euo pipefail

apt-get install --assume-yes jq > /dev/null
INSTANCE=$(curl -s "http://metadata.google.internal/computeMetadata/v1/instance/name" -H "Metadata-Flavor: Google")
ZONE=$(curl -s "http://metadata.google.internal/computeMetadata/v1/instance/zone" -H "Metadata-Flavor: Google" | sed -E 's!.*/(.*)!\1!')
TOKEN=$(gcloud auth application-default print-access-token)
INSTANCE_METADATA=$(curl "https://www.googleapis.com/compute/v1/projects/broad-dsde-cromwell-dev/zones/$ZONE/instances/$INSTANCE" -H "Authorization: Bearer $TOKEN" -H 'Accept: application/json')
echo $INSTANCE_METADATA | jq -r '.networkInterfaces[0].network' | sed -E 's!.*/(.*)!\1!'
NETWORK_OBJECT=$(echo $INSTANCE_METADATA | jq --raw-output --exit-status '.networkInterfaces[0]')
echo $NETWORK_OBJECT | jq --exit-status '.network' | sed -E 's!.*/(.*)!\1!' | sed 's/"//g' > network
echo $NETWORK_OBJECT | jq --exit-status '.subnetwork' | sed -E 's!.*/(.*)!\1!' | sed 's/"//g' > subnetwork
echo $ZONE > zone
}

runtime {
@@ -16,14 +21,19 @@ task get_network {
}

output {
String network = read_string(stdout())
String networkName = read_string("network")
String subnetworkName = read_string("subnetwork")
String zone = read_string("zone")
}
}

workflow check_network_in_vpc {
call get_network

output {
String network_used = get_network.network
String network_used = get_network.networkName
String subnetwork_used = get_network.subnetworkName
String zone_used = get_network.zone
}
}

@@ -0,0 +1,5 @@
{
"default_runtime_attributes": {
"zones": "us-east1-c"
}
}