This repository contains code and tests that interact with a Langfuse instance (self-hosted). This README explains how to start Langfuse, run the application, run unit tests, and run the integration tests that call a live Langfuse instance.
- Java 21 (or the JDK configured for the project)
- Maven (the project includes the Maven wrapper `mvnw`)
- Docker & Docker Compose (required to run Langfuse locally via the provided compose file)
- A running Langfuse instance on `http://localhost:3000` (the default used in tests)
If you prefer to use the included scripts, make sure they have execute permission:

```bash
chmod +x start-langfuse.sh run-integration-tests.sh
```

Option A — start with the helper script (recommended):

```bash
./start-langfuse.sh
```

Option B — use Docker Compose directly:

```bash
docker-compose -f docker-compose.langfuse.yml up -d
```

Verify Langfuse is reachable:

```bash
curl -I http://localhost:3000
# or
curl -s -o /dev/null -w "%{http_code}" http://localhost:3000
```

Expected HTTP codes: 200 or 302, depending on the server setup.
Integration tests require API keys to be present in `src/main/resources/application.properties` (the `run-integration-tests.sh` script also checks these properties). Add or edit these properties:

```properties
langfuse.public-key=your_public_key_here
langfuse.secret-key=your_secret_key_here
langfuse.url=http://localhost:3000
```

Note: Tests use Basic Auth constructed from these keys when calling the Langfuse public API.
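For reference, a minimal sketch of how such a header can be built, assuming the public key is used as the username and the secret key as the password (verify against your deployment and the service code):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class BasicAuthSketch {
    public static void main(String[] args) {
        String publicKey = "your_public_key_here"; // langfuse.public-key
        String secretKey = "your_secret_key_here"; // langfuse.secret-key

        // Standard Basic Auth: base64("username:password")
        String credentials = publicKey + ":" + secretKey;
        String header = "Basic " + Base64.getEncoder()
                .encodeToString(credentials.getBytes(StandardCharsets.UTF_8));

        System.out.println(header); // value for the Authorization header
    }
}
```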
Build and run using the Maven wrapper:

```bash
./mvnw -DskipTests=true spring-boot:run
```

Or build a jar and run it:

```bash
./mvnw package
java -jar target/*.jar
```

To run all unit tests:

```bash
./mvnw test
```

To run a specific unit test class (example):

```bash
./mvnw test -Dtest=LangfuseServiceTest
```

Unit tests are fast and do not require a running Langfuse instance (they use mocks in test resources).
Integration tests in this repository actually call a live Langfuse instance. They are protected by an environment variable guard to avoid accidental runs in environments without Langfuse.
The easiest way to run them is the provided runner script, which verifies Langfuse availability and API keys, sets the guard variable, and runs the integration test class:
```bash
./run-integration-tests.sh
```

What the script does:

- Checks that `http://localhost:3000` is up. If not, it offers to start Langfuse using `./start-langfuse.sh`.
- Verifies that `langfuse.public-key` and `langfuse.secret-key` exist in `src/main/resources/application.properties`.
- Exports `LANGFUSE_INTEGRATION_TEST=true` and runs `./mvnw test -Dtest=LangfuseServiceIntegrationTest` (the guard is sketched below).
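For reference, a hypothetical sketch of what such a guard can look like using JUnit 5's `@EnabledIfEnvironmentVariable` (see `LangfuseServiceIntegrationTest` for the actual mechanism used in this repo):

```java
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.condition.EnabledIfEnvironmentVariable;

// Skipped entirely unless LANGFUSE_INTEGRATION_TEST=true is set in the environment.
@EnabledIfEnvironmentVariable(named = "LANGFUSE_INTEGRATION_TEST", matches = "true")
class GuardedIntegrationTestSketch {

    @Test
    void callsLiveLangfuse() {
        // ... test body that calls the live Langfuse instance ...
    }
}
```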
Manual alternative (without the script):

```bash
export LANGFUSE_INTEGRATION_TEST=true
./mvnw test -Dtest=LangfuseServiceIntegrationTest
```

Notes and tips:
- The integration tests expect a Langfuse server at `http://localhost:3000` by default. If your server is elsewhere, update the properties file or the test configuration.
- Some Langfuse endpoints vary by deployment. Tests are defensive and may accept empty results if the public API on your instance does not expose admin endpoints (for example, per-project apiKeys endpoints may return 404 on some Langfuse builds); a sketch of this defensive pattern follows this list. See `docs/WHY_WEBCLIENT_NOT_SDK.md` and `docs/INTEGRATION_TESTS.md` for more details.
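An illustrative sketch (not the repository's actual code) of a defensive WebClient call that treats a 404 from an optional endpoint as an empty result; the endpoint path here is hypothetical:

```java
import org.springframework.web.reactive.function.client.WebClient;
import org.springframework.web.reactive.function.client.WebClientResponseException;
import reactor.core.publisher.Mono;

public class DefensiveCallSketch {
    private final WebClient client = WebClient.create("http://localhost:3000");

    public Mono<String> fetchApiKeysOrEmpty(String projectId) {
        return client.get()
                .uri("/api/public/projects/{id}/apiKeys", projectId) // hypothetical path
                .retrieve()
                .bodyToMono(String.class)
                // Some Langfuse builds do not expose this endpoint; treat 404 as "no data".
                .onErrorResume(WebClientResponseException.NotFound.class,
                        e -> Mono.just("[]"));
    }
}
```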
- Langfuse is not running or the port is unreachable: start Langfuse via `./start-langfuse.sh` or `docker-compose -f docker-compose.langfuse.yml up -d` and re-run the script.
- API keys not found: ensure `src/main/resources/application.properties` contains `langfuse.public-key` and `langfuse.secret-key`. The integration runner checks for these.
- Logs: inspect the Docker Compose logs with `docker-compose -f docker-compose.langfuse.yml logs`.
- An integration test fails with 404/405: check whether the Langfuse instance exposes the specific public endpoints the tests call. Some endpoints are admin-only and may not be present in public builds.
Do not run integration tests in CI unless the pipeline provides a running Langfuse instance and configured API keys. Unit tests are safe to run in CI.
- Integration test runner script: `run-integration-tests.sh`
- Langfuse startup script: `start-langfuse.sh`
- Reasoning for using WebClient vs the official SDK: `docs/WHY_WEBCLIENT_NOT_SDK.md`
- Integration test notes: `docs/INTEGRATION_TESTS.md`
If you need to run integration tests against a remote Langfuse instance or adjust endpoints, update `src/main/resources/application.properties` or modify the integration tests to point at a different `langfuse.url` and API keys.
Happy testing!
We keep a minimal `src/main/resources/application.properties` in the repository with non-sensitive defaults and placeholders. For local development, use the `dev` profile, which contains example (non-production) credentials and convenience settings.
Files involved
- `src/main/resources/application.properties` — canonical repo file; contains placeholders and global settings. Sensitive values are NOT stored here.
- `src/main/resources/application-dev.properties` — local development profile (example values). Activate this profile to load local DB credentials and enable convenient dev settings (e.g. `spring.jpa.hibernate.ddl-auto=update`).
How to run locally using the dev profile
- Use the Maven wrapper (recommended):

```bash
./mvnw -Dspring-boot.run.profiles=dev -DskipTests spring-boot:run
```

- Or build and run the jar with the profile active:

```bash
./mvnw package
java -Dspring.profiles.active=dev -jar target/*.jar
```

Environment variables alternative
If you prefer not to use the `application-dev.properties` file, you can provide DB credentials through environment variables instead. The application reads the following variables (if set):

- `SPRING_DATASOURCE_URL` (e.g. `jdbc:postgresql://localhost:5432/mosaico_db`)
- `SPRING_DATASOURCE_USERNAME`
- `SPRING_DATASOURCE_PASSWORD`
Example (macOS / zsh):

```bash
export SPRING_DATASOURCE_URL=jdbc:postgresql://localhost:5432/mosaico_db
export SPRING_DATASOURCE_USERNAME=mosaico
export SPRING_DATASOURCE_PASSWORD=change_me
./mvnw -DskipTests spring-boot:run
```

Security note
- Do NOT commit production credentials into repository files.
- `application-dev.properties` is intended for local development examples only. For CI and production, prefer environment variables or secret management (GitHub/GitLab CI secrets, HashiCorp Vault, etc.).
CI / Production notes
- In CI pipelines, inject the required `SPRING_DATASOURCE_*` environment variables or provide a secured properties file via the pipeline's secret mechanism. Avoid committing secrets to the repo.
This project previously included Flyway migrations, but Flyway is disabled by default for the application in this branch because the target database is empty and automatic migrations are not required. If you need to enable Flyway:
- Set the environment variable `SPRING_FLYWAY_ENABLED=true` for the application container, or
- Add `spring.flyway.enabled=true` to the active profile (for example `application-dev.properties`).
If you do enable Flyway, make sure its migration locations do not conflict with any other service sharing the same database (e.g., Langfuse). Alternatively, keep using a dedicated database for this app.
Possible follow-ups (not yet implemented):

- add a short warning header to `application-dev.properties` making clear it is for local use only, and/or
- add a `.env.example` file showing the environment variables to set locally.
This branch includes two conveniences for local development and the Mongo->JPA migration:
- Database init script: `docker/postgres-init/01-create-mosaico-db.sql` is mounted into the Postgres service under `/docker-entrypoint-initdb.d`. On a fresh Postgres initialization (when the volume is empty) the script creates the `mosaico_db` database automatically. If you already have a `langfuse_pg_data` volume, the script won't run — delete the volume to re-run initialization, or create the DB manually:

  ```bash
  # manual create (non-destructive)
  docker exec -it langfuse-postgres psql -U $POSTGRES_USER -c "CREATE DATABASE mosaico_db;"
  ```
- JSON/JSONB mapping: several entities in the codebase use flexible JSON-like structures (for example `Map<String,Object>` or `List<Map<String,Object>>`). To support persisting these fields in Postgres we added a small Jackson-based JPA `AttributeConverter` located at `src/main/java/.../converter/JsonAttributeConverter.java` and annotated the corresponding fields with `@Convert` and `@Column(columnDefinition = "jsonb")` so Hibernate persists them as JSONB columns; a sketch of the converter follows this list. This keeps the migration fast and avoids creating many small normalized tables for nested, loosely-typed data.
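A minimal sketch of what such a Jackson-based converter can look like (names and error handling are illustrative; see the actual `JsonAttributeConverter` in this repository):

```java
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import jakarta.persistence.AttributeConverter;
import jakarta.persistence.Converter;

@Converter
public class JsonAttributeConverterSketch implements AttributeConverter<Object, String> {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    @Override
    public String convertToDatabaseColumn(Object attribute) {
        if (attribute == null) return null;
        try {
            return MAPPER.writeValueAsString(attribute); // serialize to a JSON string
        } catch (JsonProcessingException e) {
            throw new IllegalArgumentException("Could not serialize attribute to JSON", e);
        }
    }

    @Override
    public Object convertToEntityAttribute(String dbData) {
        if (dbData == null) return null;
        try {
            return MAPPER.readValue(dbData, Object.class); // Map/List/primitives
        } catch (JsonProcessingException e) {
            throw new IllegalArgumentException("Could not deserialize JSON column", e);
        }
    }
}
```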
Notes on the JSON mapping:
- The converter stores arbitrary JSON-serialisable objects as JSON in the DB and deserialises them back to Java types (`Map`, `List`, primitives) when read. These values are treated as opaque JSON data by JPA (they are not managed entities); a hypothetical entity field wired this way is shown below.
- In dev the project uses `spring.jpa.hibernate.ddl-auto=update` (see `application-dev.properties`), so the new `jsonb` columns are created automatically when the application starts. In production/CI, create proper Flyway migrations to add the columns explicitly if you prefer controlled schema management.
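A hypothetical entity field showing how the converter is wired up (the entity and field names are illustrative, not from this repo):

```java
import jakarta.persistence.Column;
import jakarta.persistence.Convert;
import jakarta.persistence.Entity;
import jakarta.persistence.Id;
import java.util.Map;

@Entity
public class DocumentRecord {

    @Id
    private Long id;

    // Persisted as an opaque JSONB column via the Jackson-based converter.
    @Convert(converter = JsonAttributeConverterSketch.class)
    @Column(columnDefinition = "jsonb")
    private Map<String, Object> metadata;
}
```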
Other possible follow-ups (not yet implemented):

- make the DB-init script idempotent (check for existence before creating), or
- convert more fields from `@Transient` to real relationships/embeddables instead of JSON (recommended for long-term data integrity).