This repository was archived by the owner on Mar 26, 2024. It is now read-only.

Load tests failing to run in Ubuntu 18.04 #70

@wadeking98

Description

I'm using the aries load generator on Ubuntu 18.04. I've run the script scripts/install-vm.sh and setup/manage.sh start, and I'm using the default .env file. When I run setup/manage.sh start, all the containers start up; however, the Grafana interface shows no test data results. When I inspect the load-generator-1 container with sudo docker logs load-generator-1, it shows the following error message:

137335 ERROR c.b.s.g.a.logger.ErrorLogger - type=aries_client_error ariesClientType=AcaPy.IssuerVerifier httpMethod=POST uri=/schemas httpCode=502 durationInMs=459.14613 appErrorCode=n/a message="<html>
<head><title>502 Bad Gateway</title></head>
<body>
<center><h1>502 Bad Gateway</h1></center>
<hr><center>nginx/1.23.3</center>
</body>
</html>
" time=1672785798762
137595 INFO  o.s.b.a.l.ConditionEvaluationReportLoggingListener - 

Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.
137808 ERROR o.s.boot.SpringApplication - Application run failed
java.lang.IllegalStateException: Failed to execute CommandLineRunner

so I inspected the issuer-verifier-nginx container and I found the following error message:

2023/01/03 22:43:18 [error] 29#29: *1 connect() failed (111: Connection refused) while connecting to upstream, client: 172.25.0.23, server: , request: "POST /schemas HTTP/1.1", upstream: "http://172.25.0.11:10000/schemas", host: "issuer-verifier-nginx:10000"
2023/01/03 22:43:18 [error] 29#29: *1 connect() failed (111: Connection refused) while connecting to upstream, client: 172.25.0.23, server: , request: "POST /schemas HTTP/1.1", upstream: "http://172.25.0.13:10000/schemas", host: "issuer-verifier-nginx:10000"
2023/01/03 22:43:18 [error] 29#29: *1 connect() failed (111: Connection refused) while connecting to upstream, client: 172.25.0.23, server: , request: "POST /schemas HTTP/1.1", upstream: "http://172.25.0.14:10000/schemas", host: "issuer-verifier-nginx:10000"
2023/01/03 22:43:18 [error] 29#29: *1 connect() failed (111: Connection refused) while connecting to upstream, client: 172.25.0.23, server: , request: "POST /schemas HTTP/1.1", upstream: "http://172.25.0.16:10000/schemas", host: "issuer-verifier-nginx:10000"
2023/01/03 22:43:18 [error] 29#29: *1 connect() failed (111: Connection refused) while connecting to upstream, client: 172.25.0.23, server: , request: "POST /schemas HTTP/1.1", upstream: "http://172.25.0.15:10000/schemas", host: "issuer-verifier-nginx:10000"

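To narrow down whether the upstreams really refuse connections (rather than nginx misrouting), one could probe each address from inside the compose network. This is only a diagnostic sketch, assuming bash (for its /dev/tcp redirection) and coreutils `timeout`; the addresses and port 10000 are taken from the nginx log above, and `port_open` is a hypothetical helper, not part of the project:

```shell
#!/usr/bin/env bash
# Diagnostic sketch: check whether a TCP endpoint accepts connections.
# Uses bash's /dev/tcp pseudo-device; `timeout` bounds the attempt so a
# filtered (dropping) host does not hang the probe.
port_open() {
  timeout 2 bash -c "exec 3<>/dev/tcp/$1/$2" 2>/dev/null
}

# Example (run inside the compose network), using an address from the log:
#   port_open 172.25.0.11 10000 && echo up || echo down
```

A "connection refused" from every upstream, as in the log, would mean the agents' processes never bound port 10000 at all, which points at the agents failing during startup rather than a routing problem.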
These addresses belong to the issuer-verifier agents, so this must mean that the endpoints aren't active for some reason. This is the log output for agents_issuer-verifier-acapy_1:

2023-01-03 23:05:10,752 aries_cloudagent.config.ledger INFO Fetching genesis transactions from: http://host.docker.internal:9000/genesis
2023-01-03 23:05:10,871 aries_cloudagent.core.profile INFO Create profile manager: askar
2023-01-03 23:05:30,904 aries_askar.native.aries_askar.ffi.store INFO   src/ffi/store.rs:218 | Opened store StoreHandle(1)
2023-01-03 23:05:30,905 aries_askar.native.aries_askar.ffi.store INFO   src/ffi/store.rs:527 | Started session SessionHandle(1) on store StoreHandle(1) (txn: true)
2023-01-03 23:05:30,933 aries_askar.native.aries_askar.backend.db_utils INFO    src/backend/db_utils.rs:102 | Acquire pool connection
2023-01-03 23:05:30,933 aries_askar.native.aries_askar.backend.db_utils INFO    src/backend/db_utils.rs:105 | Start transaction
2023-01-03 23:05:31,007 aries_askar.native.aries_askar.backend.db_utils INFO    src/backend/db_utils.rs:146 | Commit transaction on close
2023-01-03 23:05:31,007 aries_askar.native.aries_askar.backend.db_utils INFO    src/backend/db_utils.rs:168 | Dropped pool connection
2023-01-03 23:05:31,007 aries_askar.native.aries_askar.ffi.store INFO   src/ffi/store.rs:1007 | Closed session SessionHandle(1)
2023-01-03 23:05:31,008 aries_askar.native.aries_askar.ffi.store INFO   src/ffi/store.rs:527 | Started session SessionHandle(2) on store StoreHandle(1) (txn: false)
2023-01-03 23:05:31,356 aries_askar.native.aries_askar.ffi.store INFO   src/ffi/store.rs:527 | Started session SessionHandle(3) on store StoreHandle(1) (txn: false)
2023-01-03 23:05:31,370 aries_askar.native.aries_askar.backend.db_utils INFO    src/backend/db_utils.rs:102 | Acquire pool connection
2023-01-03 23:05:31,377 aries_askar.native.aries_askar.ffi.store INFO   src/ffi/store.rs:1007 | Closed session SessionHandle(3)
2023-01-03 23:05:31,378 aries_askar.native.aries_askar.backend.db_utils INFO    src/backend/db_utils.rs:168 | Dropped pool connection
2023-01-03 23:05:31,378 aries_askar.native.aries_askar.ffi.store INFO   src/ffi/store.rs:527 | Started session SessionHandle(4) on store StoreHandle(1) (txn: false)
2023-01-03 23:05:31,385 aries_askar.native.aries_askar.backend.db_utils INFO    src/backend/db_utils.rs:102 | Acquire pool connection
2023-01-03 23:05:31,420 aries_askar.native.aries_askar.ffi.store INFO   src/ffi/store.rs:1007 | Closed session SessionHandle(4)
2023-01-03 23:05:31,420 aries_askar.native.aries_askar.backend.db_utils INFO    src/backend/db_utils.rs:168 | Dropped pool connection
2023-01-03 23:05:31,421 aries_askar.native.aries_askar.ffi.store INFO   src/ffi/store.rs:527 | Started session SessionHandle(5) on store StoreHandle(1) (txn: false)
2023-01-03 23:05:31,426 aries_askar.native.aries_askar.backend.db_utils INFO    src/backend/db_utils.rs:102 | Acquire pool connection
2023-01-03 23:05:31,428 aries_askar.native.aries_askar.ffi.store INFO   src/ffi/store.rs:1007 | Closed session SessionHandle(5)
2023-01-03 23:05:31,428 aries_askar.native.aries_askar.backend.db_utils INFO    src/backend/db_utils.rs:168 | Dropped pool connection
2023-01-03 23:05:31,460 aries_askar.native.aries_askar.ffi.store INFO   src/ffi/store.rs:527 | Started session SessionHandle(6) on store StoreHandle(1) (txn: false)
2023-01-03 23:05:31,499 aries_askar.native.aries_askar.backend.db_utils INFO    src/backend/db_utils.rs:102 | Acquire pool connection
2023-01-03 23:05:31,529 aries_askar.native.aries_askar.ffi.store INFO   src/ffi/store.rs:1007 | Closed session SessionHandle(6)
2023-01-03 23:05:31,530 aries_askar.native.aries_askar.backend.db_utils INFO    src/backend/db_utils.rs:168 | Dropped pool connection
2023-01-03 23:05:31,530 aries_askar.native.aries_askar.ffi.store INFO   src/ffi/store.rs:527 | Started session SessionHandle(7) on store StoreHandle(1) (txn: false)
2023-01-03 23:05:31,533 aries_askar.native.aries_askar.backend.db_utils INFO    src/backend/db_utils.rs:102 | Acquire pool connection
2023-01-03 23:05:31,534 aries_askar.native.aries_askar.ffi.store INFO   src/ffi/store.rs:1007 | Closed session SessionHandle(7)
2023-01-03 23:05:31,535 aries_askar.native.aries_askar.backend.db_utils INFO    src/backend/db_utils.rs:168 | Dropped pool connection
2023-01-03 23:05:31,642 aries_askar.native.aries_askar.backend.db_utils INFO    src/backend/db_utils.rs:102 | Acquire pool connection

The logs on the other issuer-verifier agents are either blank or contain just the top two lines.
I noticed that the holder agents seem to start up fine. The only real difference between the two is that the holder agents use sqlite wallet storage, while the issuer-verifiers use postgres storage. I inspected the database container issuer-verifier-wallet-db and it shows the following error:

2023-01-03 22:39:58.585 UTC [1] LOG:  starting PostgreSQL 13.1 on x86_64-pc-linux-musl, compiled by gcc (Alpine 10.2.1_pre1) 10.2.1 20201203, 64-bit
2023-01-03 22:39:58.590 UTC [1] LOG:  listening on IPv4 address "0.0.0.0", port 5000
2023-01-03 22:39:58.591 UTC [1] LOG:  listening on IPv6 address "::", port 5000
2023-01-03 22:39:58.620 UTC [1] LOG:  listening on Unix socket "/var/run/postgresql/.s.PGSQL.5000"
2023-01-03 22:39:58.647 UTC [47] LOG:  database system was shut down at 2023-01-03 22:39:58 UTC
2023-01-03 22:39:58.666 UTC [1] LOG:  database system is ready to accept connections
2023-01-03 22:40:14.519 UTC [56] FATAL:  database "wallet_db" does not exist
2023-01-03 22:40:14.522 UTC [57] FATAL:  database "wallet_db" does not exist

It's saying that wallet_db doesn't exist, which is strange because if I connect using psql I can reach the database and view its tables:
(screenshots: psql session connecting to the database and listing its tables)

I'm not really sure what is going on here. Is this a timing issue? Maybe the agents are starting up before Postgres creates the wallet_db database? Any help would be appreciated.
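If it is a startup race, one common workaround is to gate the agent start on the database actually accepting connections. A minimal sketch in POSIX sh, assuming `pg_isready` is available in the agent image; the `wait_for` helper is hypothetical, and the service name and port 5000 are taken from the wallet-db log above:

```shell
#!/bin/sh
# Hypothetical wait-for helper: retry a probe command once per second
# until it succeeds or the attempt budget runs out.
wait_for() {
  tries=$1; shift
  i=0
  until "$@" >/dev/null 2>&1; do
    i=$((i + 1))
    [ "$i" -ge "$tries" ] && return 1
    sleep 1
  done
}

# e.g. in the agent container, before launching ACA-Py
# (service name and port taken from the wallet-db log):
#   wait_for 30 pg_isready -h issuer-verifier-wallet-db -p 5000 || exit 1
```

The same effect can be had declaratively with a Compose healthcheck on the wallet-db service plus a `depends_on: condition: service_healthy` on the agents, if the compose file version in use supports it.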
