Conformance Test Suite #237
# FIRES Conformance Test Suite

[AGPL-3.0 License](http://www.gnu.org/licenses/agpl-3.0)

A comprehensive conformance test suite for validating FIRES protocol implementations. This tool helps ensure that FIRES servers correctly implement the protocol for exchanging moderation data.
## What is FIRES?

**FIRES** stands for **F**ediverse **I**ntelligence **R**eplication **E**ndpoint **S**erver. It's a protocol that allows Fediverse instances to exchange lists of recommended actions to take against instances sharing nonconsensual imagery, abuse, or bigotry.

This conformance suite validates that server implementations correctly follow the [FIRES Protocol Specification](https://fires.fedimod.org/reference/protocol/).
## Features

- Protocol-level validation of FIRES server implementations
- Support for both local and remote server testing
- Multiple output formats for CI/CD integration
- Docker-based distribution for platform independence
- Selective test execution by suite
- Built on Vitest for fast, reliable testing
## Installation

### Via Docker

```bash
# Run conformance tests against a server
docker run --rm ghcr.io/fedimod/fires-conformance \
  --url https://your-fires-server.example

# Run tests with a specific output format
docker run --rm ghcr.io/fedimod/fires-conformance \
  --url https://your-fires-server.example \
  --reporter junit \
  --output-file results.xml
```
## Usage

### Basic Usage

Test a FIRES server implementation:

```bash
docker run --rm ghcr.io/fedimod/fires-conformance --url https://fires.example.org
```
### Command Line Options

#### Required Options

- `--url <url>` - URL of the FIRES server to test

#### Output Options

- `--reporter <type>` - Output format for test results
  - `console` (default) - Human-readable console output
  - `junit` - JUnit XML format for CI/CD systems
  - `html` - HTML report file
  - `json` - JSON format for programmatic consumption
- `--output-file <path>` - Path to write the output file (required for non-console reporters)
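For `--reporter json`, the results file can be consumed by scripts. A minimal sketch, assuming a Jest-style result shape (Vitest's JSON reporter follows Jest's schema; verify the exact field names against your Vitest version):

```typescript
// Hedged sketch: summarize a `--reporter json` results file.
// The field names (success, numTotalTests, numFailedTests) are assumptions
// based on Vitest's Jest-compatible JSON output, not part of this suite's API.
interface ConformanceResults {
  success: boolean;
  numTotalTests: number;
  numFailedTests: number;
}

function summarize(json: string): string {
  const results = JSON.parse(json) as ConformanceResults;
  const passed = results.numTotalTests - results.numFailedTests;
  return `${passed}/${results.numTotalTests} conformance checks passed`;
}

// In practice the input would be read from the file written via --output-file.
const sample = `{"success": false, "numTotalTests": 42, "numFailedTests": 3}`;
console.log(summarize(sample)); // 39/42 conformance checks passed
```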
#### Test Selection

- `--suites <suites>` - Run only specific test suites (comma-separated)
  - Example: `--suites labels,nodeinfo`
  - Available suites:
    - `labels` - Label endpoint tests
    - `datasets` - Dataset endpoint tests (when implemented)
    - `nodeinfo` - NodeInfo endpoint tests
#### Other Options

- `--help` - Display help information
- `--version` - Display version information
- `--verbose` - Enable verbose output
- `--no-color` - Disable colored output
### Examples

#### CI/CD Integration

```bash
# Generate a JUnit report for CI systems
docker run --rm ghcr.io/fedimod/fires-conformance \
  --url https://staging.fires.example.org \
  --reporter junit \
  --output-file test-results.xml
```

#### Selective Testing

```bash
# Test only the Labels endpoints
docker run --rm ghcr.io/fedimod/fires-conformance \
  --url https://fires.example.org --suites labels

# Test multiple suites
docker run --rm ghcr.io/fedimod/fires-conformance \
  --url https://fires.example.org --suites labels,nodeinfo
```

#### Docker Examples

```bash
# Test a local development server
docker run --rm --network host \
  ghcr.io/fedimod/fires-conformance \
  --url http://localhost:3333

# Generate a JUnit report for CI
docker run --rm -v "$(pwd)":/output \
  ghcr.io/fedimod/fires-conformance \
  --url https://fires.example.org \
  --reporter junit \
  --output-file /output/results.xml

# Test with verbose output
docker run --rm \
  ghcr.io/fedimod/fires-conformance \
  --url https://fires.example.org \
  --verbose
```
## What Gets Tested

The conformance suite validates:

### Labels Endpoint

- Collection endpoint (`/labels`) returns valid JSON-LD
- Individual label endpoints (`/labels/:id`) return valid JSON-LD
- Pagination behavior
- Label structure and required fields
- Linked data context validity
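As an illustration of the last check, a label document must declare a JSON-LD `@context`. A minimal sketch (the sample payload below is hypothetical; the actual required fields come from the FIRES specification):

```typescript
// A JSON-LD @context may be a string (IRI), an array, or an object.
type JsonLdDocument = { "@context"?: unknown; [key: string]: unknown };

function hasJsonLdContext(doc: JsonLdDocument): boolean {
  const ctx = doc["@context"];
  return (
    typeof ctx === "string" ||
    Array.isArray(ctx) ||
    (typeof ctx === "object" && ctx !== null)
  );
}

// Hypothetical label payload; field names are illustrative only.
const label: JsonLdDocument = {
  "@context": "https://fires.fedimod.org/ns",
  id: "https://fires.example.org/labels/1",
};
console.log(hasJsonLdContext(label)); // true
```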
| ### NodeInfo Endpoint | ||
| - Well-known discovery endpoint (`/.well-known/nodeinfo`) | ||
| - NodeInfo 2.1 endpoint structure | ||
| - Required metadata fields | ||
|
Contributor
There was a problem hiding this comment. Choose a reason for hiding this commentThe reason will be displayed to describe this comment to others. Learn more. These are just |
||
| - Protocol identification | ||
|
|
||
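The discovery step can be sketched as follows. The `rel` URI is the standard NodeInfo 2.1 schema identifier from the NodeInfo specification; the `href` value here is a placeholder:

```typescript
// /.well-known/nodeinfo must return a links document pointing at the
// NodeInfo 2.1 schema, per the NodeInfo protocol.
interface NodeInfoLink { rel: string; href: string }
interface NodeInfoDiscovery { links: NodeInfoLink[] }

const NODEINFO_21_REL = "http://nodeinfo.diaspora.software/ns/schema/2.1";

function findNodeInfo21(doc: NodeInfoDiscovery): string | undefined {
  return doc.links.find((link) => link.rel === NODEINFO_21_REL)?.href;
}

const discovery: NodeInfoDiscovery = {
  links: [{ rel: NODEINFO_21_REL, href: "https://fires.example.org/nodeinfo/2.1" }],
};
console.log(findNodeInfo21(discovery)); // https://fires.example.org/nodeinfo/2.1
```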
### Datasets Endpoint

- Dataset collection endpoints
- Individual dataset retrieval
- Resumable data transfer
- Change tracking
## CI/CD Integration

### GitHub Actions

```yaml
name: FIRES Conformance
on: [push, pull_request]

jobs:
  conformance:
    runs-on: ubuntu-latest
    steps:
      - name: Start FIRES server
        run: |
          # Your server startup logic here
          docker compose up -d

      - name: Wait for server
        run: |
          timeout 60 bash -c 'until curl -f http://localhost:3333/nodeinfo/2.1; do sleep 2; done'

      - name: Run conformance tests
        run: |
          docker run --rm --network host \
            ghcr.io/fedimod/fires-conformance \
            --url http://localhost:3333 \
            --reporter junit \
            --output-file results.xml

      - name: Publish test results
        uses: EnricoMi/publish-unit-test-result-action@v2
        if: always()
        with:
          files: results.xml
```
## Development

### Running Tests Locally

```bash
# Install dependencies
pnpm install

# Run tests against a local server
pnpm test -- --url http://localhost:3333

# Run tests with coverage
pnpm test:coverage

# Run in watch mode during development
pnpm test:watch
```
### Project Structure

Tests are organized by suite in separate directories for selective execution:

```
components/conformance/
├── src/
│   ├── tests/
│   │   ├── labels/     # Label endpoint tests (--suites labels)
│   │   ├── datasets/   # Dataset endpoint tests (--suites datasets)
│   │   └── nodeinfo/   # NodeInfo tests (--suites nodeinfo)
│   └── cli.ts          # CLI interface
├── package.json
├── vitest.config.ts
├── Dockerfile
└── README.md
```

The CLI maps `--suites` options to specific test directories, allowing Vitest to run only the relevant test files.
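The mapping itself is small; a sketch of the idea, mirroring the suite-validation logic in `src/cli.ts`:

```typescript
// Map requested suites to their test directories, rejecting unknown names.
// Mirrors the behaviour of the CLI's --suites handling.
const KNOWN_SUITES = ["labels", "datasets", "nodeinfo"];

function suitesToTestDirs(suites: string[]): string[] {
  const invalid = suites.filter((s) => !KNOWN_SUITES.includes(s));
  if (invalid.length > 0) {
    throw new Error(`Invalid suite(s): ${invalid.join(", ")}`);
  }
  return suites.map((suite) => `src/tests/${suite}`);
}

console.log(suitesToTestDirs(["labels", "nodeinfo"]));
// [ 'src/tests/labels', 'src/tests/nodeinfo' ]
```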
## API Design Stability

The conformance suite exposes a deliberately limited CLI interface to maintain API stability. While the test suite is built on Vitest, we don't expose all Vitest options directly, to avoid locking ourselves into Vitest-specific features as part of the public API.

If you need additional testing capabilities not covered by the current CLI options, please [open an issue](https://github.com/fedimod/fires/issues) to discuss your use case.
## Contributing

Contributions are welcome! When adding new tests:

1. Place tests in the appropriate suite directory for selective execution
2. Follow the existing test structure and naming conventions
3. Document any new command-line options
4. Update this README with examples
5. Ensure tests work in Docker environments
## License

This project is licensed under the AGPL-3.0 License.

## Acknowledgements

See [Acknowledgements](/README.md#acknowledgements) in the main FIRES repository.
`package.json` (new file):

```json
{
  "name": "@fedimod/fires-conformance",
  "version": "0.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "ts-node src/cli.ts"
  },
  "keywords": [],
  "author": "",
  "license": "AGPL-3.0",
  "packageManager": "pnpm@10.9.0",
  "dependencies": {
    "@optique/core": "^0.8.0",
    "@optique/run": "^0.8.0",
    "bcp-47": "^2.1.0",
    "jsonld": "^9.0.0",
    "ts-node-maintained": "^10.9.6",
    "typescript": "~5.7.3",
    "vitest": "^4.0.14"
  }
}
```

> Review thread on `ts-node-maintained`: a contributor suggested it "should probably be a dev dep"; the author asked "Should it? It's a dependency for running the tool, no, through cli.js?"; the contributor replied "nope, as the distribution would be compiled."
`src/cli.ts` (new file):

```typescript
#!/usr/bin/env node

import { merge, object, conditional } from "@optique/core/constructs";
import { optional, withDefault } from "@optique/core/modifiers";
import { option } from "@optique/core/primitives";
import { choice, string, url } from "@optique/core/valueparser";
import type { ValueParser, ValueParserResult } from "@optique/core/valueparser";
import { message, values } from "@optique/core/message";
import { run } from "@optique/run";
import { startVitest } from "vitest/node";

const AVAILABLE_SUITES = ["labels", "datasets", "nodeinfo"];

// Custom value parser for a comma-separated suite list
function commaSeparatedSuites(): ValueParser<string[]> {
  return {
    metavar: "SUITE[,SUITE...]",
    parse(input: string): ValueParserResult<string[]> {
      const suites = input
        .split(",")
        .map((s) => s.trim())
        .filter((s) => s.length > 0);
      const invalidSuites = suites.filter((s) => !AVAILABLE_SUITES.includes(s));

      if (invalidSuites.length > 0) {
        return {
          success: false,
          error: message`Invalid suite(s): ${values(invalidSuites)}. Available: ${values(AVAILABLE_SUITES)}`,
        };
      }

      return { success: true, value: suites };
    },
    format(suites: string[]): string {
      return suites.join(",");
    },
  };
}

const consoleParser = object({
  noColor: withDefault(option("--no-color"), false),
});

// Base options shared by all configurations
const parser = object({
  url: option("--url", url()),
  suites: optional(option("--suites", commaSeparatedSuites())),
  verbose: withDefault(option("--verbose"), false),
  reporter: conditional(
    option("--reporter", choice(["console", "junit", "html", "json"])),
    {
      console: consoleParser,
      junit: object({
        outputFile: option("--output-file", string()),
      }),
      html: object({
        outputFile: option("--output-file", string()),
      }),
      json: object({
        outputFile: option("--output-file", string()),
      }),
    },
    consoleParser,
  ),
});

async function main() {
  const pkg = require("../package.json");

  // Run the parser with optique
  const options = run(parser, {
    programName: "fires-conformance",
    help: "option",
    version: pkg.version,
  });

  // Convert suites to test directories if provided
  const testDirs = options.suites?.map((suite) => `src/tests/${suite}`);

  // Transform options for Vitest
  const vitestConfig: any = {
    run: true,
    mode: "test",
    env: {
      FIRES_SERVER_URL: options.url.href,
    },
  };

  const [reporter, reporterConfig] = options.reporter;

  if (reporter === "console" || typeof reporter === "undefined") {
    vitestConfig.color = !reporterConfig.noColor;
  }

  if (reporter === "html" || reporter === "json" || reporter === "junit") {
    // File-based reporters (junit/html/json)
    vitestConfig.reporters = [reporter];
    vitestConfig.outputFile = reporterConfig.outputFile;
  }

  // Configure logging level
  if (options.verbose) {
    vitestConfig.logLevel = "info";
  }

  // Run Vitest programmatically
  const vitest = await startVitest("test", testDirs || [], vitestConfig);

  if (!vitest) {
    console.error("Failed to start Vitest");
    process.exit(1);
  }

  // Exit with an appropriate code based on test results
  const hasFailures =
    vitest.state.getUnhandledErrors().length > 0 ||
    vitest.state.getCountOfFailedTests() > 0;

  await vitest.close();
  process.exit(hasFailures ? 1 : 0);
}

main().catch((error) => {
  console.error("Fatal error:", error);
  process.exit(1);
});
```