Bump org.apache.parquet:parquet-avro from 1.13.1 to 1.15.2 #1405
Commit: 6e4653d
Google Cloud Build / fhir-data-pipes-pr (cloud-build-fhir) failed Jul 16, 2025 in 20m 17s

Summary

Build Information

Trigger fhir-data-pipes-pr
Build d21c5090-e43a-4e2a-8033-e7c72197a663
Start 2025-07-16T09:47:23-07:00
Duration 19m21.919s
Status FAILURE

Steps

Step Status Duration
Launch HAPI Source Server SUCCESS 28.001s
Launch Sink Server Search SUCCESS 25.506s
Launch Sink Server JDBC SUCCESS 25.424s
Wait for the initial Servers Start SUCCESS 1m2.34s
Compile Bunsen and Pipeline SUCCESS 5m24.518s
Build Uploader Image SUCCESS 20.322s
Run Uploader Unit Tests SUCCESS 3.101s
Build E2E Image SUCCESS 2m17.726s
Upload to HAPI SUCCESS 1m14.07s
Build Pipeline Images SUCCESS 21.104s
Run Batch Pipeline in FHIR-search mode with HAPI source SUCCESS 51.41s
Run E2E Test for FHIR-search mode with HAPI source SUCCESS 9.851s
Run Batch Pipeline for JDBC mode with HAPI source SUCCESS 48.151s
Run E2E Test for JDBC mode with HAPI source SUCCESS 9.754s
Run Batch Pipeline for BULK_EXPORT mode with HAPI source SUCCESS 4m14.698s
Run E2E Test for BULK_EXPORT mode with HAPI source SUCCESS 7.808s
Turn down FHIR Sink Server Search SUCCESS 4.444s
Turn down FHIR Sink Server JDBC SUCCESS 4.979s
Create views database SUCCESS 829ms
Bring up controller and Spark containers SUCCESS 11m18.132s
Run E2E Test for Dockerized Controller and Spark Thriftserver FAILURE 53.567s
Bring down controller and Spark containers QUEUED 46.097s
Launch HAPI FHIR Sink Server QUEUED 2m21.923s
Bring up the pipeline controller for FHIR server to FHIR server sync QUEUED 3m59.553s
Run E2E Test for Dockerized Controller in FHIR server to FHIR server sync mode QUEUED 52.285s
Bring down the pipeline controller QUEUED 0s
Turn down HAPI Source Server QUEUED 0s
Turn down FHIR Sink Server Controller for e2e tests QUEUED 0s
Launch OpenMRS Server and HAPI FHIR Sink Server for OpenMRS SUCCESS 0s
Wait for Servers Start SUCCESS 0s
Upload to OpenMRS SUCCESS 0s
Run Batch Pipeline FHIR-search mode with OpenMRS source CANCELLED 0s
Run E2E Test for FHIR-search mode with OpenMRS source QUEUED 0s
Run Batch Pipeline for JDBC mode with OpenMRS source QUEUED 0s
Run E2E Test for JDBC mode with OpenMRS source QUEUED 0s
Test Indicators QUEUED 0s
Turn down Webserver and HAPI Server QUEUED 0s

Details

starting build "d21c5090-e43a-4e2a-8033-e7c72197a663"

FETCHSOURCE
hint: Using 'master' as the name for the initial branch. This default branch name
hint: is subject to change. To configure the initial branch name to use in all
hint: of your new repositories, which will suppress this warning, call:
hint:
hint: 	git config --global init.defaultBranch <name>
hint:
hint: Names commonly chosen instead of 'master' are 'main', 'trunk' and
hint: 'development'. The just-created branch can be renamed via this command:
hint:
hint: 	git branch -m <name>
Initialized empty Git repository in /workspace/.git/
From https://github.com/google/fhir-data-pipes
 * branch            6e4653dba1912488468b840231b5ddd00410db3f -> FETCH_HEAD
Updating files: 100% (1056/1056), done.
HEAD is now at 6e4653d Bump org.apache.parquet:parquet-avro from 1.13.1 to 1.15.2
GitCommit:
6e4653dba1912488468b840231b5ddd00410db3f
BUILD
Starting Step #1 - "Launch Sink Server Search"
Starting Step #4 - "Compile Bunsen and Pipeline"
Starting Step #0 - "Launch HAPI Source Server"
Starting Step #2 - "Launch Sink Server JDBC"
Starting Step #5 - "Build Uploader Image"
Starting Step #7 - "Build E2E Image"
Step #4 - "Compile Bunsen and Pipeline": Pulling image: maven:3.8.5-openjdk-17
Step #0 - "Launch HAPI Source Server": Pulling image: docker/compose
Step #2 - "Launch Sink Server JDBC": Pulling image: docker/compose
Step #1 - "Launch Sink Server Search": Pulling image: docker/compose
Step #5 - "Build Uploader Image": Already have image (with digest): gcr.io/cloud-builders/docker
Step #7 - "Build E2E Image": Already have image (with digest): gcr.io/cloud-builders/docker
Step #2 - "Launch Sink Server JDBC": Using default tag: latest
Step #1 - "Launch Sink Server Search": Using default tag: latest
Step #0 - "Launch HAPI Source Server": Using default tag: latest
Step #5 - "Build Uploader Image": Sending build context to Docker daemon  1.466MB

Step #5 - "Build Uploader Image": Step 1/10 : FROM python:3.7-slim
Step #7 - "Build E2E Image": Sending build context to Docker daemon   65.6MB

Step #7 - "Build E2E Image": Step 1/14 : FROM maven:3.8.7-eclipse-temurin-17-focal
Step #1 - "Launch Sink Server Search": latest: Pulling from docker/compose
Step #1 - "Launch Sink Server Search": aad63a933944: Pulling fs layer
Step #1 - "Launch Sink Server Search": b396cd7cbac4: Pulling fs layer
Step #1 - "Launch Sink Server Search": 0426ec0ed60a: Pulling fs layer
Step #1 - "Launch Sink Server Search": 9ac2a98ece5b: Pulling fs layer
Step #1 - "Launch Sink Server Search": 9ac2a98ece5b: Waiting
Step #2 - "Launch Sink Server JDBC": latest: Pulling from docker/compose
Step #2 - "Launch Sink Server JDBC": aad63a933944: Pulling fs layer
Step #2 - "Launch Sink Server JDBC": b396cd7cbac4: Pulling fs layer
Step #2 - "Launch Sink Server JDBC": 0426ec0ed60a: Pulling fs layer
Step #2 - "Launch Sink Server JDBC": 9ac2a98ece5b: Pulling fs layer
Step #2 - "Launch Sink Server JDBC": 9ac2a98ece5b: Waiting
Step #1 - "Launch Sink Server Search": b396cd7cbac4: Verifying Checksum
Step #1 - "Launch Sink Server Search": b396cd7cbac4: Download complete
Step #2 - "Launch Sink Server JDBC": b396cd7cbac4: Verifying Checksum
Step #2 - "Launch Sink Server JDBC": b396cd7cbac4: Download complete
Step #4 - "Compile Bunsen and Pipeline": 3.8.5-openjdk-17: Pulling from library/maven
Step #0 - "Launch HAPI Source Server": latest: Pulling from docker/compose
Step #0 - "Launch HAPI Source Server": aad63a933944: Pulling fs layer
Step #0 - "Launch HAPI Source Server": b396cd7cbac4: Pulling fs layer
Step #0 - "Launch HAPI Source Server": 0426ec0ed60a: Pulling fs layer
Step #0 - "Launch HAPI Source Server": 9ac2a98ece5b: Pulling fs layer
Step #0 - "Launch HAPI Source Server": 9ac2a98ece5b: Waiting
Step #0 - "Launch HAPI Source Server": b396cd7cbac4: Download complete
Step #2 - "Launch Sink Server JDBC": aad63a933944: Download complete
Step #0 - "Launch HAPI Source Server": aad63a933944: Download complete
Step #1 - "Launch Sink Server Search": aad63a933944: Download complete
Step #4 - "Compile Bunsen and Pipeline": 38a980f2cc8a: Pulling fs layer
Step #4 - "Compile Bunsen and Pipeline": de849f1cfbe6: Pulling fs layer
Step #4 - "Compile Bunsen and Pipeline": a7203ca35e75: Pulling fs layer
Step #4 - "Compile Bunsen and Pipeline": 3337662e6dc9: Pulling fs layer
Step #4 - "Compile Bunsen and Pipeline": 81485058ab89: Pulling fs layer
Step #4 - "Compile Bunsen and Pipeline": b548970362bb: Pulling fs layer
Step #4 - "Compile Bunsen and Pipeline": dbd02ad382f5: Pulling fs layer
Step #4 - "Compile Bunsen and Pipeline": de849f1cfbe6: Waiting
Step #4 - "Compile Bunsen and Pipeline": 3337662e6dc9: Waiting
Step #4 - "Compile Bunsen and Pipeline": a7203ca35e75: Waiting
Step #4 - "Compile Bunsen and Pipeline": b548970362bb: Waiting
Step #4 - "Compile Bunsen and Pipeline": dbd02ad382f5: Waiting
Step #4 - "Compile Bunsen and Pipeline": 81485058ab89: Waiting
Step #2 - "Launch Sink Server JDBC": aad63a933944: Pull complete
Step #0 - "Launch HAPI Source Server": aad63a933944: Pull complete
Step #1 - "Launch Sink Server Search": aad63a933944: Pull complete
Step #2 - "Launch Sink Server JDBC": b396cd7cbac4: Pull complete
Step #1 - "Launch Sink Server Search": b396cd7cbac4: Pull complete
Step #0 - "Launch HAPI Source Server": b396cd7cbac4: Pull complete
Step #1 - "Launch Sink Server Search": 0426ec0ed60a: Verifying Checksum
Step #1 - "Launch Sink Server Search": 0426ec0ed60a: Download complete
Step #0 - "Launch HAPI Source Server": 0426ec0ed60a: Verifying Checksum
Step #0 - "Launch HAPI Source Server": 0426ec0ed60a: Download complete
Step #2 - "Launch Sink Server JDBC": 0426ec0ed60a: Verifying Checksum
Step #2 - "Launch Sink Server JDBC": 0426ec0ed60a: Download complete
Step #1 - "Launch Sink Server Search": 9ac2a98ece5b: Verifying Checksum
Step #1 - "Launch Sink Server Search": 9ac2a98ece5b: Download complete
Step #2 - "Launch Sink Server JDBC": 9ac2a98ece5b: Verifying Checksum
Step #2 - "Launch Sink Server JDBC": 9ac2a98ece5b: Download complete
Step #0 - "Launch HAPI Source Server": 9ac2a98ece5b: Verifying Checksum
Step #0 - "Launch HAPI Source Server": 9ac2a98ece5b: Download complete
Step #7 - "Build E2E Image": 3.8.7-eclipse-temurin-17-focal: Pulling from library/maven
Step #4 - "Compile Bunsen and Pipeline": de849f1cfbe6: Verifying Checksum
Step #4 - "Compile Bunsen and Pipeline": de849f1cfbe6: Download complete
Step #4 - "Compile Bunsen and Pipeline": 38a980f2cc8a: Verifying Checksum
Step #4 - "Compile Bunsen and Pipeline": 38a980f2cc8a: Download complete
Step #7 - "Build E2E Image": 7608715873ec: Pulling fs layer
Step #7 - "Build E2E Image": 64a0b7566174: Pulling fs layer
Step #7 - "Build E2E Image": 414e25888ba9: Pulling fs layer
Step #7 - "Build E2E Image": fa1796814410: Pulling fs layer
Step #7 - "Build E2E Image": dc3ab4515b24: Pulling fs layer
Step #7 - "Build E2E Image": 495d1ae42cb9: Pulling fs layer
Step #7 - "Build E2E Image": 66b6d86e5b33: Pulling fs layer
Step #7 - "Build E2E Image": 90062ecd5dec: Pulling fs layer
Step #7 - "Build E2E Image": 414e25888ba9: Waiting
Step #7 - "Build E2E Image": fa1796814410: Waiting
Step #7 - "Build E2E Image": dc3ab4515b24: Waiting
Step #7 - "Build E2E Image": 7608715873ec: Waiting
Step #7 - "Build E2E Image": 90062ecd5dec: Waiting
Step #7 - "Build E2E Image": 66b6d86e5b33: Waiting
Step #7 - "Build E2E Image": 495d1ae42cb9: Waiting
Step #7 - "Build E2E Image": 64a0b7566174: Waiting
Step #1 - "Launch Sink Server Search": 0426ec0ed60a: Pull complete
Step #2 - "Launch Sink Server JDBC": 0426ec0ed60a: Pull complete
Step #0 - "Launch HAPI Source Server": 0426ec0ed60a: Pull complete
Step #5 - "Build Uploader Image": 3.7-slim: Pulling from library/python
Step #1 - "Launch Sink Server Search": 9ac2a98ece5b: Pull complete
Step #0 - "Launch HAPI Source Server": 9ac2a98ece5b: Pull complete
Step #2 - "Launch Sink Server JDBC": 9ac2a98ece5b: Pull complete
Step #1 - "Launch Sink Server Search": Digest: sha256:b60a020c0f68047b353a4a747f27f5e5ddb17116b7b018762edfb6f7a6439a82
Step #2 - "Launch Sink Server JDBC": Digest: sha256:b60a020c0f68047b353a4a747f27f5e5ddb17116b7b018762edfb6f7a6439a82
Step #0 - "Launch HAPI Source Server": Digest: sha256:b60a020c0f68047b353a4a747f27f5e5ddb17116b7b018762edfb6f7a6439a82
Step #1 - "Launch Sink Server Search": Status: Downloaded newer image for docker/compose:latest
Step #2 - "Launch Sink Server JDBC": Status: Downloaded newer image for docker/compose:latest
Step #0 - "Launch HAPI Source Server": Status: Downloaded newer image for docker/compose:latest
Step #1 - "Launch Sink Server Search": docker.io/docker/compose:latest
Step #2 - "Launch Sink Server JDBC": docker.io/docker/compose:latest
Step #0 - "Launch HAPI Source Server": docker.io/docker/compose:latest
Step #5 - "Build Uploader Image": a803e7c4b030: Pulling fs layer
Step #5 - "Build Uploader Image": bf3336e84c8e: Pulling fs layer
Step #5 - "Build Uploader Image": 8973eb85275f: Pulling fs layer
Step #5 - "Build Uploader Image": f9afc3cc0135: Pulling fs layer
Step #5 - "Build Uploader Image": 39312d8b4ab7: Pulling fs layer
Step #5 - "Build Uploader Image": a803e7c4b030: Waiting
Step #5 - "Build Uploader Image": bf3336e84c8e: Waiting
Step #5 - "Build Uploader Image": 8973eb85275f: Waiting
Step #5 - "Build Uploader Image": f9afc3cc0135: Waiting
Step #4 - "Compile Bunsen and Pipeline": 81485058ab89: Download complete
Step #4 - "Compile Bunsen and Pipeline": b548970362bb: Verifying Checksum
Step #4 - "Compile Bunsen and Pipeline": b548970362bb: Download complete
Step #4 - "Compile Bunsen and Pipeline": dbd02ad382f5: Download complete
Step #7 - "Build E2E Image": 7608715873ec: Verifying Checksum
Step #7 - "Build E2E Image": 7608715873ec: Download complete
Step #4 - "Compile Bunsen and Pipeline": a7203ca35e75: Verifying Checksum
Step #4 - "Compile Bunsen and Pipeline": a7203ca35e75: Download complete
Step #4 - "Compile Bunsen and Pipeline": 38a980f2cc8a: Pull complete
Step #4 - "Compile Bunsen and Pipeline": 3337662e6dc9: Verifying Checksum
Step #4 - "Compile Bunsen and Pipeline": 3337662e6dc9: Download complete
Step #7 - "Build E2E Image": 64a0b7566174: Verifying Checksum
Step #7 - "Build E2E Image": 64a0b7566174: Download complete
Step #7 - "Build E2E Image": fa1796814410: Verifying Checksum
Step #7 - "Build E2E Image": fa1796814410: Download complete
Step #7 - "Build E2E Image": 495d1ae42cb9: Verifying Checksum
Step #7 - "Build E2E Image": 495d1ae42cb9: Download complete
Step #7 - "Build E2E Image": dc3ab4515b24: Verifying Checksum
Step #7 - "Build E2E Image": dc3ab4515b24: Download complete
Step #4 - "Compile Bunsen and Pipeline": de849f1cfbe6: Pull complete
Step #7 - "Build E2E Image": 66b6d86e5b33: Download complete
Step #7 - "Build E2E Image": 7608715873ec: Pull complete
Step #7 - "Build E2E Image": 90062ecd5dec: Download complete
Step #5 - "Build Uploader Image": bf3336e84c8e: Verifying Checksum
Step #5 - "Build Uploader Image": bf3336e84c8e: Download complete
Step #5 - "Build Uploader Image": a803e7c4b030: Download complete
Step #5 - "Build Uploader Image": 8973eb85275f: Verifying Checksum
Step #5 - "Build Uploader Image": 8973eb85275f: Download complete
Step #5 - "Build Uploader Image": f9afc3cc0135: Verifying Checksum
Step #5 - "Build Uploader Image": f9afc3cc0135: Download complete
Step #5 - "Build Uploader Image": 39312d8b4ab7: Verifying Checksum
Step #5 - "Build Uploader Image": 39312d8b4ab7: Download complete
Step #7 - "Build E2E Image": 414e25888ba9: Verifying Checksum
Step #7 - "Build E2E Image": 414e25888ba9: Download complete
Step #7 - "Build E2E Image": 64a0b7566174: Pull complete
Step #5 - "Build Uploader Image": a803e7c4b030: Pull complete
Step #5 - "Build Uploader Image": bf3336e84c8e: Pull complete
Step #4 - "Compile Bunsen and Pipeline": a7203ca35e75: Pull complete
Step #5 - "Build Uploader Image": 8973eb85275f: Pull complete
Step #7 - "Build E2E Image": 414e25888ba9: Pull complete
Step #5 - "Build Uploader Image": f9afc3cc0135: Pull complete
Step #7 - "Build E2E Image": fa1796814410: Pull complete
Step #5 - "Build Uploader Image": 39312d8b4ab7: Pull complete
Step #5 - "Build Uploader Image": Digest: sha256:b53f496ca43e5af6994f8e316cf03af31050bf7944e0e4a308ad86c001cf028b
Step #5 - "Build Uploader Image": Status: Downloaded newer image for python:3.7-slim
Step #5 - "Build Uploader Image":  ---> a255ffcb469f
Step #5 - "Build Uploader Image": Step 2/10 : WORKDIR /uploader
Step #5 - "Build Uploader Image":  ---> Running in 896b0d23ccd0
Step #5 - "Build Uploader Image": Removing intermediate container 896b0d23ccd0
Step #5 - "Build Uploader Image":  ---> 001537cbd151
Step #5 - "Build Uploader Image": Step 3/10 : COPY  ./ ./
Step #5 - "Build Uploader Image":  ---> 099453de0ec9
Step #5 - "Build Uploader Image": Step 4/10 : RUN pip install -r requirements.txt
Step #5 - "Build Uploader Image":  ---> Running in 1ee9908de24e
Step #7 - "Build E2E Image": dc3ab4515b24: Pull complete
Step #7 - "Build E2E Image": 495d1ae42cb9: Pull complete
Step #7 - "Build E2E Image": 66b6d86e5b33: Pull complete
Step #7 - "Build E2E Image": 90062ecd5dec: Pull complete
Step #7 - "Build E2E Image": Digest: sha256:ad4b34f02e52164df83182a2a05074b5288d6e6bcc2dfa0ce3d6fa43ec8b557f
Step #7 - "Build E2E Image": Status: Downloaded newer image for maven:3.8.7-eclipse-temurin-17-focal
Step #7 - "Build E2E Image":  ---> 896b49b4d0b7
Step #7 - "Build E2E Image": Step 2/14 : RUN apt-get update && apt-get install -y jq  python3.8 python3-pip
Step #4 - "Compile Bunsen and Pipeline": 3337662e6dc9: Pull complete
Step #7 - "Build E2E Image":  ---> Running in 5a023c8454d7
Step #4 - "Compile Bunsen and Pipeline": 81485058ab89: Pull complete
Step #4 - "Compile Bunsen and Pipeline": b548970362bb: Pull complete
Step #4 - "Compile Bunsen and Pipeline": dbd02ad382f5: Pull complete
Step #4 - "Compile Bunsen and Pipeline": Digest: sha256:3a9c30b3af6278a8ae0007d3a3bf00fff80ec3ed7ae4eb9bfa1772853101549b
Step #4 - "Compile Bunsen and Pipeline": Status: Downloaded newer image for maven:3.8.5-openjdk-17
Step #4 - "Compile Bunsen and Pipeline": docker.io/library/maven:3.8.5-openjdk-17
Step #1 - "Launch Sink Server Search": Creating volume "sink-server-search_hapi-data" with default driver
Step #7 - "Build E2E Image": Get:1 http://archive.ubuntu.com/ubuntu focal InRelease [265 kB]
Step #7 - "Build E2E Image": Get:2 http://security.ubuntu.com/ubuntu focal-security InRelease [128 kB]
Step #2 - "Launch Sink Server JDBC": Creating volume "sink-server-jdbc_hapi-data" with default driver
Step #7 - "Build E2E Image": Get:3 http://archive.ubuntu.com/ubuntu focal-updates InRelease [128 kB]
Step #7 - "Build E2E Image": Get:4 http://archive.ubuntu.com/ubuntu focal-backports InRelease [128 kB]
Step #7 - "Build E2E Image": Get:5 http://archive.ubuntu.com/ubuntu focal/universe amd64 Packages [11.3 MB]
Step #0 - "Launch HAPI Source Server": Creating network "hapi-compose_default" with the default driver
Step #7 - "Build E2E Image": Get:6 http://security.ubuntu.com/ubuntu focal-security/restricted amd64 Packages [4,801 kB]
Step #7 - "Build E2E Image": Get:7 http://archive.ubuntu.com/ubuntu focal/restricted amd64 Packages [33.4 kB]
Step #7 - "Build E2E Image": Get:8 http://archive.ubuntu.com/ubuntu focal/main amd64 Packages [1,275 kB]
Step #7 - "Build E2E Image": Get:9 http://archive.ubuntu.com/ubuntu focal/multiverse amd64 Packages [177 kB]
Step #7 - "Build E2E Image": Get:10 http://security.ubuntu.com/ubuntu focal-security/main amd64 Packages [4,432 kB]
Step #7 - "Build E2E Image": Get:11 http://archive.ubuntu.com/ubuntu focal-updates/universe amd64 Packages [1,599 kB]
Step #7 - "Build E2E Image": Get:12 http://security.ubuntu.com/ubuntu focal-security/multiverse amd64 Packages [33.1 kB]
Step #7 - "Build E2E Image": Get:13 http://security.ubuntu.com/ubuntu focal-security/universe amd64 Packages [1,308 kB]
Step #7 - "Build E2E Image": Get:14 http://archive.ubuntu.com/ubuntu focal-updates/restricted amd64 Packages [4,998 kB]
Step #7 - "Build E2E Image": Get:15 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 Packages [4,919 kB]
Step #7 - "Build E2E Image": Get:16 http://archive.ubuntu.com/ubuntu focal-updates/multiverse amd64 Packages [36.8 kB]
Step #0 - "Launch HAPI Source Server": Creating volume "hapi-compose_hapi-fhir-db" with default driver
Step #7 - "Build E2E Image": Get:17 http://archive.ubuntu.com/ubuntu focal-backports/main amd64 Packages [55.2 kB]
Step #7 - "Build E2E Image": Get:18 http://archive.ubuntu.com/ubuntu focal-backports/universe amd64 Packages [28.6 kB]
Step #1 - "Launch Sink Server Search": Pulling sink-server (hapiproject/hapi:latest)...
Step #2 - "Launch Sink Server JDBC": Pulling sink-server (hapiproject/hapi:latest)...
Step #0 - "Launch HAPI Source Server": Creating volume "hapi-compose_hapi-server" with default driver
Step #5 - "Build Uploader Image": Collecting google-auth
Step #5 - "Build Uploader Image":   Downloading google_auth-2.40.3-py2.py3-none-any.whl (216 kB)
Step #5 - "Build Uploader Image":      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 216.1/216.1 kB 5.6 MB/s eta 0:00:00
Step #5 - "Build Uploader Image": Collecting mock
Step #5 - "Build Uploader Image":   Downloading mock-5.2.0-py3-none-any.whl (31 kB)
Step #5 - "Build Uploader Image": Collecting requests
Step #5 - "Build Uploader Image":   Downloading requests-2.31.0-py3-none-any.whl (62 kB)
Step #5 - "Build Uploader Image":      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 62.6/62.6 kB 9.1 MB/s eta 0:00:00
Step #5 - "Build Uploader Image": Collecting rsa<5,>=3.1.4
Step #5 - "Build Uploader Image":   Downloading rsa-4.9.1-py3-none-any.whl (34 kB)
Step #5 - "Build Uploader Image": Collecting cachetools<6.0,>=2.0.0
Step #5 - "Build Uploader Image":   Downloading cachetools-5.5.2-py3-none-any.whl (10 kB)
Step #5 - "Build Uploader Image": Collecting pyasn1-modules>=0.2.1
Step #5 - "Build Uploader Image":   Downloading pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Step #5 - "Build Uploader Image":      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 181.3/181.3 kB 21.6 MB/s eta 0:00:00
Step #5 - "Build Uploader Image": Collecting certifi>=2017.4.17
Step #5 - "Build Uploader Image":   Downloading certifi-2025.7.14-py3-none-any.whl (162 kB)
Step #5 - "Build Uploader Image":      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 162.7/162.7 kB 21.1 MB/s eta 0:00:00
Step #4 - "Compile Bunsen and Pipeline": [INFO] Error stacktraces are turned on.
Step #4 - "Compile Bunsen and Pipeline": [INFO] Scanning for projects...
Step #0 - "Launch HAPI Source Server": Pulling db (postgres:)...
Step #5 - "Build Uploader Image": Collecting charset-normalizer<4,>=2
Step #5 - "Build Uploader Image":   Downloading charset_normalizer-3.4.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (141 kB)
Step #5 - "Build Uploader Image":      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 141.3/141.3 kB 18.8 MB/s eta 0:00:00
Step #1 - "Launch Sink Server Search": latest: Pulling from hapiproject/hapi
Step #5 - "Build Uploader Image": Collecting idna<4,>=2.5
Step #5 - "Build Uploader Image":   Downloading idna-3.10-py3-none-any.whl (70 kB)
Step #5 - "Build Uploader Image":      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 70.4/70.4 kB 10.0 MB/s eta 0:00:00
Step #2 - "Launch Sink Server JDBC": latest: Pulling from hapiproject/hapi
Step #5 - "Build Uploader Image": Collecting urllib3<3,>=1.21.1
Step #5 - "Build Uploader Image":   Downloading urllib3-2.0.7-py3-none-any.whl (124 kB)
Step #5 - "Build Uploader Image":      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 124.2/124.2 kB 14.5 MB/s eta 0:00:00
Step #5 - "Build Uploader Image": Collecting pyasn1<0.6.0,>=0.4.6
Step #5 - "Build Uploader Image":   Downloading pyasn1-0.5.1-py2.py3-none-any.whl (84 kB)
Step #5 - "Build Uploader Image":      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 84.9/84.9 kB 11.5 MB/s eta 0:00:00
Step #0 - "Launch HAPI Source Server": latest: Pulling from library/postgres
Step #5 - "Build Uploader Image": Installing collected packages: urllib3, pyasn1, mock, idna, charset-normalizer, certifi, cachetools, rsa, requests, pyasn1-modules, google-auth
Step #7 - "Build E2E Image": Fetched 35.7 MB in 3s (13.0 MB/s)
Step #4 - "Compile Bunsen and Pipeline": [INFO] ------------------------------------------------------------------------
Step #4 - "Compile Bunsen and Pipeline": [INFO] Reactor Build Order:
Step #4 - "Compile Bunsen and Pipeline": [INFO] 
Step #4 - "Compile Bunsen and Pipeline": [INFO] root                                                               [pom]
Step #4 - "Compile Bunsen and Pipeline": [INFO] Bunsen Parent                                                      [pom]
Step #4 - "Compile Bunsen and Pipeline": [INFO] Extension Structure Definitions                                    [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] Bunsen Core                                                        [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] Bunsen Core R4                                                     [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] Bunsen Core Stu3                                                   [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] Bunsen Avro                                                        [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] FHIR Analytics                                                     [pom]
Step #4 - "Compile Bunsen and Pipeline": [INFO] common                                                             [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] batch                                                              [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] controller                                                         [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] coverage                                                           [pom]
Step #4 - "Compile Bunsen and Pipeline": [INFO] 
Step #4 - "Compile Bunsen and Pipeline": [INFO] Using the MultiThreadedBuilder implementation with a thread count of 64
Step #4 - "Compile Bunsen and Pipeline": [INFO] 
Step #4 - "Compile Bunsen and Pipeline": [INFO] -------------------< com.google.fhir.analytics:root >-------------------
Step #4 - "Compile Bunsen and Pipeline": [INFO] Building root 0.2.7-SNAPSHOT                                      [1/12]
Step #4 - "Compile Bunsen and Pipeline": [INFO] --------------------------------[ pom ]---------------------------------
Step #5 - "Build Uploader Image": Successfully installed cachetools-5.5.2 certifi-2025.7.14 charset-normalizer-3.4.2 google-auth-2.40.3 idna-3.10 mock-5.2.0 pyasn1-0.5.1 pyasn1-modules-0.3.0 requests-2.31.0 rsa-4.9.1 urllib3-2.0.7
Step #5 - "Build Uploader Image": �[91mWARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
Step #4 - "Compile Bunsen and Pipeline": [INFO] 
Step #4 - "Compile Bunsen and Pipeline": [INFO] --- jacoco-maven-plugin:0.8.13:prepare-agent (default) @ root ---
Step #5 - "Build Uploader Image": �[0m�[91m
Step #5 - "Build Uploader Image": [notice] A new release of pip is available: 23.0.1 -> 24.0
Step #5 - "Build Uploader Image": [notice] To update, run: pip install --upgrade pip
Step #4 - "Compile Bunsen and Pipeline": [INFO] argLine set to -javaagent:/root/.m2/repository/org/jacoco/org.jacoco.agent/0.8.13/org.jacoco.agent-0.8.13-runtime.jar=destfile=/workspace/target/jacoco.exec
Step #4 - "Compile Bunsen and Pipeline": [INFO] 
Step #4 - "Compile Bunsen and Pipeline": [INFO] --- maven-install-plugin:2.4:install (default-install) @ root ---
Step #7 - "Build E2E Image": Reading package lists...
Step #4 - "Compile Bunsen and Pipeline": [INFO] Installing /workspace/pom.xml to /root/.m2/repository/com/google/fhir/analytics/root/0.2.7-SNAPSHOT/root-0.2.7-SNAPSHOT.pom
Step #4 - "Compile Bunsen and Pipeline": [INFO] 
Step #4 - "Compile Bunsen and Pipeline": [INFO] ------------------< com.cerner.bunsen:bunsen-parent >-------------------
Step #4 - "Compile Bunsen and Pipeline": [INFO] Building Bunsen Parent 0.5.14-SNAPSHOT                            [2/12]
Step #4 - "Compile Bunsen and Pipeline": [INFO] --------------------------------[ pom ]---------------------------------
Step #4 - "Compile Bunsen and Pipeline": [INFO] 
Step #4 - "Compile Bunsen and Pipeline": [INFO] ----------------< com.google.fhir.analytics:pipelines >-----------------
Step #4 - "Compile Bunsen and Pipeline": [INFO] Building FHIR Analytics 0.2.7-SNAPSHOT                            [3/12]
Step #4 - "Compile Bunsen and Pipeline": [INFO] --------------------------------[ pom ]---------------------------------
Step #5 - "Build Uploader Image": �[0mRemoving intermediate container 1ee9908de24e
Step #5 - "Build Uploader Image":  ---> 13fd7fc50870
Step #5 - "Build Uploader Image": Step 5/10 : ENV INPUT_DIR="./test_files"
Step #5 - "Build Uploader Image":  ---> Running in 3b79badf32c7
Step #5 - "Build Uploader Image": Removing intermediate container 3b79badf32c7
Step #5 - "Build Uploader Image":  ---> 757e0964bfcc
Step #5 - "Build Uploader Image": Step 6/10 : ENV CORES=""
Step #5 - "Build Uploader Image":  ---> Running in 5e27c9ed9e07
Step #5 - "Build Uploader Image": Removing intermediate container 5e27c9ed9e07
Step #5 - "Build Uploader Image":  ---> 57b6e6a3c196
Step #5 - "Build Uploader Image": Step 7/10 : ENV CONVERT=""
Step #5 - "Build Uploader Image":  ---> Running in e8c779437770
Step #5 - "Build Uploader Image": Removing intermediate container e8c779437770
Step #5 - "Build Uploader Image":  ---> bcbb3a30c1dc
Step #5 - "Build Uploader Image": Step 8/10 : ENV SINK_TYPE="HAPI"
Step #5 - "Build Uploader Image":  ---> Running in ad1ca43e502e
Step #4 - "Compile Bunsen and Pipeline": [INFO] 
Step #4 - "Compile Bunsen and Pipeline": [INFO] --- jacoco-maven-plugin:0.8.13:prepare-agent (default) @ bunsen-parent ---
Step #4 - "Compile Bunsen and Pipeline": [INFO] argLine set to -javaagent:/root/.m2/repository/org/jacoco/org.jacoco.agent/0.8.13/org.jacoco.agent-0.8.13-runtime.jar=destfile=/workspace/bunsen/target/jacoco.exec
Step #4 - "Compile Bunsen and Pipeline": [INFO] 
Step #4 - "Compile Bunsen and Pipeline": [INFO] --- maven-resources-plugin:3.3.1:resources (generate-resources) @ bunsen-parent ---
Step #4 - "Compile Bunsen and Pipeline": [INFO] 
Step #4 - "Compile Bunsen and Pipeline": [INFO] --- directory-maven-plugin:1.0:highest-basedir (directories) @ pipelines ---
Step #5 - "Build Uploader Image": Removing intermediate container ad1ca43e502e
Step #5 - "Build Uploader Image":  ---> b6ea9150ca95
Step #5 - "Build Uploader Image": Step 9/10 : ENV FHIR_ENDPOINT="http://localhost:8098/fhir"
Step #5 - "Build Uploader Image":  ---> Running in 32884bc205a7
Step #4 - "Compile Bunsen and Pipeline": [INFO] skip non existing resourceDirectory /workspace/bunsen/src/main/resources
Step #4 - "Compile Bunsen and Pipeline": [INFO] 
Step #4 - "Compile Bunsen and Pipeline": [INFO] --- spotless-maven-plugin:2.45.0:apply (default) @ bunsen-parent ---
Step #5 - "Build Uploader Image": Removing intermediate container 32884bc205a7
Step #5 - "Build Uploader Image":  ---> ffa1decd4ced
Step #5 - "Build Uploader Image": Step 10/10 : CMD cd /uploader; python main.py ${SINK_TYPE}     ${FHIR_ENDPOINT} --input_dir ${INPUT_DIR} ${CORES} ${CONVERT}
Step #7 - "Build E2E Image": Reading package lists...
Step #5 - "Build Uploader Image":  ---> Running in ab38203fb669
Step #7 - "Build E2E Image": Building dependency tree...
Step #7 - "Build E2E Image": Reading state information...
Step #5 - "Build Uploader Image": Removing intermediate container ab38203fb669
Step #5 - "Build Uploader Image":  ---> dbeda8a9a995
Step #5 - "Build Uploader Image": Successfully built dbeda8a9a995
Step #5 - "Build Uploader Image": Successfully tagged us-docker.pkg.dev/cloud-build-fhir/fhir-analytics/synthea-uploader:6e4653d
Step #7 - "Build E2E Image": The following additional packages will be installed:
Step #7 - "Build E2E Image":   build-essential cpp cpp-9 dirmngr dpkg-dev fakeroot file g++ g++-9 gcc
Step #7 - "Build E2E Image":   gcc-10-base gcc-9 gcc-9-base gnupg gnupg-l10n gnupg-utils gpg gpg-agent
Step #7 - "Build E2E Image":   gpg-wks-client gpg-wks-server gpgconf gpgsm gpgv libalgorithm-diff-perl
Step #7 - "Build E2E Image":   libalgorithm-diff-xs-perl libalgorithm-merge-perl libasan5 libassuan0
Step #7 - "Build E2E Image":   libatomic1 libc-dev-bin libc6 libc6-dev libcc1-0 libcrypt-dev libdpkg-perl
Step #7 - "Build E2E Image":   libexpat1 libexpat1-dev libfakeroot libfile-fcntllock-perl libgcc-9-dev
Step #7 - "Build E2E Image":   libgcc-s1 libgomp1 libisl22 libitm1 libjq1 libksba8 liblocale-gettext-perl
Step #7 - "Build E2E Image":   liblsan0 libmagic-mgc libmagic1 libmpc3 libmpdec2 libmpfr6 libnpth0 libonig5
Step #7 - "Build E2E Image":   libpython3-dev libpython3-stdlib libpython3.8 libpython3.8-dev
Step #7 - "Build E2E Image":   libpython3.8-minimal libpython3.8-stdlib libquadmath0 libreadline8
Step #7 - "Build E2E Image":   libstdc++-9-dev libstdc++6 libtsan0 libubsan1 linux-libc-dev make manpages
Step #7 - "Build E2E
...
[Logs truncated due to log size limitations. For full logs, see https://storage.cloud.google.com/cloud-build-gh-logs/log-d21c5090-e43a-4e2a-8033-e7c72197a663.txt.]
...
: > Content-Type: application/json
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > Accept: */*
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > 
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < HTTP/1.1 200 
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Content-Type: text/plain;charset=UTF-8
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Content-Length: 7
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Date: Wed, 16 Jul 2025 17:06:21 GMT
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < 
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": { [7 bytes data]
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 
100     7  100     7    0     0     25      0 --:--:-- --:--:-- --:--:--    25
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": * Connection #0 to host pipeline-controller left intact
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": Note: Unnecessary use of -X or --request, GET is already inferred.
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":                                  Dload  Upload   Total   Spent    Left  Speed
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0*   Trying 192.168.10.11:8080...
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": * Connected to pipeline-controller (192.168.10.11) port 8080 (#0)
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > GET /status? HTTP/1.1
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > Host: pipeline-controller:8080
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > User-Agent: curl/7.88.1
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > Content-Type: application/json
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > Accept: */*
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > 
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < HTTP/1.1 200 
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Content-Type: application/json
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Transfer-Encoding: chunked
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Date: Wed, 16 Jul 2025 17:06:21 GMT
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < 
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": { [68 bytes data]
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 
100    62    0    62    0     0    797      0 --:--:-- --:--:-- --:--:--   805
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": * Connection #0 to host pipeline-controller left intact
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:21.420 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=12000
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:21.529 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=6900
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:21.536 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=15400
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:21.592 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=10300
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:21.619 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=13700
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:21.718 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=5200
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:21.735 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=8600
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:24.608 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=200
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:25.863 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=1900
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:26.002 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=3600
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": Note: Unnecessary use of -X or --request, GET is already inferred.
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":                                  Dload  Upload   Total   Spent    Left  Speed
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0*   Trying 192.168.10.11:8080...
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": * Connected to pipeline-controller (192.168.10.11) port 8080 (#0)
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > GET /status? HTTP/1.1
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > Host: pipeline-controller:8080
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > User-Agent: curl/7.88.1
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > Content-Type: application/json
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > Accept: */*
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > 
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < HTTP/1.1 200 
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Content-Type: application/json
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Transfer-Encoding: chunked
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Date: Wed, 16 Jul 2025 17:06:26 GMT
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < 
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": { [68 bytes data]
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 
100    62    0    62    0     0   9776      0 --:--:-- --:--:-- --:--:-- 10333
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": * Connection #0 to host pipeline-controller left intact
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:26.649 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=7000
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:26.704 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=12100
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:26.809 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=10400
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:26.897 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=13800
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:26.911 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=15500
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:26.999 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=5300
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:27.059 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=8700
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:29.252 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=300
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:30.985 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=2000
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:31.321 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=3700
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": Note: Unnecessary use of -X or --request, GET is already inferred.
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":                                  Dload  Upload   Total   Spent    Left  Speed
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0*   Trying 192.168.10.11:8080...
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": * Connected to pipeline-controller (192.168.10.11) port 8080 (#0)
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > GET /status? HTTP/1.1
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > Host: pipeline-controller:8080
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > User-Agent: curl/7.88.1
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > Content-Type: application/json
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > Accept: */*
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > 
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < HTTP/1.1 200 
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Content-Type: application/json
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Transfer-Encoding: chunked
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Date: Wed, 16 Jul 2025 17:06:31 GMT
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < 
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": { [68 bytes data]
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 
100    62    0    62    0     0   8918      0 --:--:-- --:--:-- --:--:-- 10333
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": * Connection #0 to host pipeline-controller left intact
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:31.585 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=7100
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:31.638 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=12200
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:31.736 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=10500
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:31.759 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=13900
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:31.787 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=15600
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:31.923 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=5400
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:31.983 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=8800
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:34.680 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=400
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": Note: Unnecessary use of -X or --request, GET is already inferred.
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":                                  Dload  Upload   Total   Spent    Left  Speed
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0*   Trying 192.168.10.11:8080...
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": * Connected to pipeline-controller (192.168.10.11) port 8080 (#0)
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > GET /status? HTTP/1.1
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > Host: pipeline-controller:8080
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > User-Agent: curl/7.88.1
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > Content-Type: application/json
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > Accept: */*
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > 
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < HTTP/1.1 500 
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Content-Type: application/json
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Transfer-Encoding: chunked
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Date: Wed, 16 Jul 2025 17:06:36 GMT
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Connection: close
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < 
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": { [113 bytes data]
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 
100   107    0   107    0     0   2782      0 --:--:-- --:--:-- --:--:--  2815
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": * Closing connection 0
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:36.765 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=2100
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:36.935 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=3800
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:37.273 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=7200
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:37.388 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=15700
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": java.io.IOException: Could not read footer: java.lang.RuntimeException: file:/workspace/e2e-tests/controller-spark/dwh/controller_DWH_TIMESTAMP_2025_07_16T17_06_21_092598793Z/Patient/ConvertResourceFn_Patient_output-parquet-th-718-ts-1752685594569-r-446242.parquet is not a Parquet file (too small length: 0)
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:37.593 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=14000
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:37.812 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=10600
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:37.896 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=12300
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:38.096 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=5500
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:38.397 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=8900
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": java.io.IOException: Could not read footer: java.lang.RuntimeException: file:/workspace/e2e-tests/controller-spark/dwh/controller_DWH_TIMESTAMP_2025_07_16T17_06_21_092598793Z/Encounter/ConvertResourceFn_Encounter_output-parquet-th-406-ts-1752685595339-r-789392.parquet is not a Parquet file (too small length: 0)
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": java.io.IOException: Could not read footer: java.lang.RuntimeException: file:/workspace/e2e-tests/controller-spark/dwh/controller_DWH_TIMESTAMP_2025_07_16T17_06_21_092598793Z/Observation/ConvertResourceFn_Observation_output-parquet-th-1637-ts-1752685598627-r-555743.parquet is not a Parquet file (too small length: 0)
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 
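
All three "Could not read footer" failures point at zero-length output files. Parquet stores its schema and row-group index in a footer at the end of the file, terminated by a 4-byte footer length and the PAR1 magic, so a length of 0 fails the reader's sanity check before any decoding; that is the "too small length: 0" complaint above. A minimal probe using the parquet-hadoop reader (the path argument is a placeholder, not taken from the build):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.parquet.hadoop.ParquetFileReader;
import org.apache.parquet.hadoop.util.HadoopInputFile;

public class FooterProbe {
  public static void main(String[] args) throws Exception {
    // Placeholder path; point it at any file under the DWH snapshot directory.
    Path path = new Path(args[0]);
    try (ParquetFileReader reader =
        ParquetFileReader.open(HadoopInputFile.fromPath(path, new Configuration()))) {
      // A healthy file reports its row count from the footer metadata;
      // a zero-length file throws the same "too small length" error seen above.
      System.out.println("rows: " + reader.getRecordCount());
    }
  }
}
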
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:40.437 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=500
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:42.748 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=3900
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:43.035 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=2200
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": SUCCESS
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER PARQUET BASED DEPLOYMENT: Total patients: 
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER PARQUET BASED DEPLOYMENT: Total encounters: 
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER PARQUET BASED DEPLOYMENT: Total observations: 
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER PARQUET BASED DEPLOYMENT: Total patient flat rows: 0
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER PARQUET BASED DEPLOYMENT: Total encounter flat rows: 0
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER PARQUET BASED DEPLOYMENT: Total observation flat rows: 0
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER PARQUET BASED DEPLOYMENT: Mismatch in count of records
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER PARQUET BASED DEPLOYMENT: Actual total patients: , expected total: 79
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER PARQUET BASED DEPLOYMENT: Actual total encounters: , expected total: 4006
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER PARQUET BASED DEPLOYMENT: Actual total observations: , expected total: 17279
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER PARQUET BASED DEPLOYMENT: Actual total materialized view patients: 0, expected total: 528
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER PARQUET BASED DEPLOYMENT: Actual total materialized view encounters: 0, expected total: 4006
Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER PARQUET BASED DEPLOYMENT: Actual total materialized view observations: 0, expected total: 17279
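
All three "flat rows" counts are 0 and the resource counts print empty, which is consistent with the empty Parquet files above: Spark registers the views, but every scan returns no rows, so the comparison against the expected totals fails. A count query of the kind the test issues can be reproduced over JDBC against the Spark Thriftserver; in this sketch the jdbc:hive2 URL, port 10000, and the patient_flat table name are assumptions, and the org.apache.hive:hive-jdbc driver must be on the classpath:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class CountRows {
  public static void main(String[] args) throws Exception {
    // Assumed Thriftserver address and table name; adjust to the deployment.
    try (Connection conn =
            DriverManager.getConnection("jdbc:hive2://localhost:10000/default", "", "");
        Statement stmt = conn.createStatement();
        ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM patient_flat")) {
      rs.next();
      // Returns 0 here because the underlying Parquet files are empty.
      System.out.println("patient_flat rows: " + rs.getLong(1));
    }
  }
}
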
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:43.074 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=15800
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:43.166 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=7300
Finished Step #20 - "Run E2E Test for Dockerized Controller and Spark Thriftserver"
ERROR
ERROR: build step 20 "us-docker.pkg.dev/cloud-build-fhir/fhir-analytics/e2e-tests/controller-spark:6e4653d" failed: step exited with non-zero status: 2
Step #31 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 17:06:43.465 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:136 - Fetching 100 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=8beddf0c-3d62-484c-8f13-2b3af1349369&_getpagesoffset=14100
Step #2 - "Launch Sink Server JDBC": �[1A�[2K
Creating sink-server-jdbc ... �[32mdone�[0m
�[1B
Step #1 - "Launch Sink Server Search": �[1A�[2K
Creating sink-server-search ... �[32mdone�[0m
�[1B
Step #0 - "Launch HAPI Source Server": �[1A�[2K
Creating hapi-fhir-db ... �[32mdone�[0m
�[1B�[2A�[2K
Creating hapi-server  ... �[32mdone�[0m
�[2B
Step #19 - "Bring up controller and Spark containers": �[1A�[2K
Creating pipeline-controller ... �[32mdone�[0m
�[1B
Step #28 - "Launch OpenMRS Server and HAPI FHIR Sink Server for OpenMRS": �[1A�[2K
Creating openmrs                 ... �[32mdone�[0m
�[1B

Build Log: https://storage.cloud.google.com/cloud-build-gh-logs/log-d21c5090-e43a-4e2a-8033-e7c72197a663.txt