2 changes: 2 additions & 0 deletions .gitignore
Original file line number Diff line number Diff line change
Expand Up @@ -2,5 +2,7 @@ node_modules/
.env
audit-logs/
credentials/
deliverables/
dist/
local/
repos/
61 changes: 57 additions & 4 deletions README.md
Original file line number Diff line number Diff line change
Expand Up @@ -65,7 +65,7 @@ Shannon is developed by [Keygraph](https://keygraph.io) and available in two edi
> **This repository contains Shannon Lite,** the core autonomous AI pentesting framework. **Shannon Pro** is Keygraph's all-in-one AppSec platform, combining SAST, SCA, secrets scanning, business logic security testing, and autonomous AI pentesting in a single correlated workflow. Every finding is validated with a working proof-of-concept exploit.

> [!IMPORTANT]
> **White-box only.** Shannon Lite is designed for **white-box (source-available)** application security testing.
> **White-box only.** Shannon Lite is designed for **white-box (source-available)** application security testing.
> It expects access to your application's source code and repository layout.

### Shannon Pro: Architecture Overview
Expand Down Expand Up @@ -391,7 +391,7 @@ pipeline:

Shannon also supports [Amazon Bedrock](https://aws.amazon.com/bedrock/) instead of using an Anthropic API key.

#### Quick Setup
#### Quick Setup (Bedrock API key)

1. Add your AWS credentials to `.env`:

Expand All @@ -412,7 +412,60 @@ ANTHROPIC_LARGE_MODEL=us.anthropic.claude-opus-4-6
./shannon start URL=https://example.com REPO=repo-name
```

Shannon uses three model tiers: **small** (`claude-haiku-4-5-20251001`) for summarization, **medium** (`claude-sonnet-4-6`) for security analysis, and **large** (`claude-opus-4-6`) for deep reasoning. Set `ANTHROPIC_SMALL_MODEL`, `ANTHROPIC_MEDIUM_MODEL`, and `ANTHROPIC_LARGE_MODEL` to the Bedrock model IDs for your region.
Shannon uses three model tiers: **small** (`claude-haiku-4-5-20251001`) for summarization, **medium** (`claude-sonnet-4-6`) for security analysis, and **large** (`claude-opus-4-6`) for deep reasoning. Set `ANTHROPIC_SMALL_MODEL`, `ANTHROPIC_MEDIUM_MODEL`, and `ANTHROPIC_LARGE_MODEL` to the Bedrock model IDs for your region, or to an [application inference profile ARN](https://docs.aws.amazon.com/bedrock/latest/userguide/applications.html) if you are using Bedrock applications.
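For example, a minimal `.env` sketch for this setup (the model IDs below are illustrative — substitute the cross-region inference profile IDs or application inference profile ARNs actually available in your AWS account and region):

```shell
# Illustrative .env fragment — model IDs vary by region and account.
CLAUDE_CODE_USE_BEDROCK=1
AWS_REGION=us-west-2
ANTHROPIC_SMALL_MODEL=us.anthropic.claude-haiku-4-5-20251001
ANTHROPIC_MEDIUM_MODEL=us.anthropic.claude-sonnet-4-6
ANTHROPIC_LARGE_MODEL=us.anthropic.claude-opus-4-6
```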

#### Using AWS SSO / IAM credentials (no Bedrock API key)

Some organizations prefer AWS IAM Identity Center (SSO) or short-lived STS credentials over Bedrock API keys. Shannon forwards the standard AWS credential environment variables through to Claude Code, so you can set this up in either of two ways:

**Option A: Manual Setup**

1. Log in with AWS SSO and obtain temporary credentials:

```bash
aws sso login --profile <your-profile-name>
aws configure export-credentials --profile <your-profile-name> --format env
```

2. Export those credentials into your shell before running Shannon:

```bash
export CLAUDE_CODE_USE_BEDROCK=1
export AWS_REGION=us-west-2

# Point model tiers at your Bedrock model or application inference profile ARN
export ANTHROPIC_SMALL_MODEL=arn:aws:bedrock:us-west-2:123456789012:application-inference-profile/your-profile
export ANTHROPIC_MEDIUM_MODEL=arn:aws:bedrock:us-west-2:123456789012:application-inference-profile/your-profile
export ANTHROPIC_LARGE_MODEL=arn:aws:bedrock:us-west-2:123456789012:application-inference-profile/your-profile

# Short-lived AWS credentials from SSO / STS
export AWS_ACCESS_KEY_ID=...
export AWS_SECRET_ACCESS_KEY=...
export AWS_SESSION_TOKEN=...
export AWS_DEFAULT_REGION=us-west-2
```

**Option B: Automated Script (Recommended)**

Use the provided helper script to automatically fetch and export AWS credentials via SSO into your current shell:

```bash
# Source the script to load temporary credentials into your shell
# (must use 'source', not './script.sh')
source ./scripts/bedrock-sso-login.sh <your-profile-name>

# The script will:
# 1. Log in with AWS SSO (if not already cached)
# 2. Export temporary credentials into your current shell
# 3. Display confirmation message

# Now run Shannon as usual (credentials are already in your shell)
./shannon start URL=https://example.com REPO=repo-name
```

**Note:** The script must be **sourced** (not executed) so that the exported AWS credentials (`AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, `AWS_SESSION_TOKEN`, `AWS_DEFAULT_REGION`) remain in your current shell session. If no profile name is provided, the script falls back to the default profile.

Claude Code will then authenticate to Bedrock using these IAM credentials while still respecting the model IDs you configured for each tier.

### Google Vertex AI

Expand Down Expand Up @@ -697,7 +750,7 @@ Book a free 15-min session for hands-on help with bugs, deployments, or config q

💬 [Join our Discord](https://discord.gg/cmctpMBXwE) to ask questions, share feedback, and connect with other Shannon users.

**Contributing:** At this time, we're not accepting external code contributions (PRs).
**Contributing:** At this time, we're not accepting external code contributions (PRs).
Issues are welcome for bug reports and feature requests.

- 🐛 **Report bugs** via [GitHub Issues](https://github.com/KeygraphHQ/shannon/issues)
Expand Down
10 changes: 9 additions & 1 deletion docker-compose.yml
Original file line number Diff line number Diff line change
Expand Up @@ -26,7 +26,15 @@ services:
- CLAUDE_CODE_OAUTH_TOKEN=${CLAUDE_CODE_OAUTH_TOKEN:-}
- CLAUDE_CODE_USE_BEDROCK=${CLAUDE_CODE_USE_BEDROCK:-}
- AWS_REGION=${AWS_REGION:-}
- AWS_BEARER_TOKEN_BEDROCK=${AWS_BEARER_TOKEN_BEDROCK:-}
# NOTE: Only pass AWS_BEARER_TOKEN_BEDROCK when it is actually set.
# An empty value causes Claude Code to attempt bearer-token auth with a
# blank token instead of falling back to SigV4 IAM credentials.
# Do NOT pass AWS_PROFILE — the container has no ~/.aws/config,
# and the AWS SDK prefers profile over env-var creds, causing auth failure.
- AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID:-}
- AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY:-}
- AWS_SESSION_TOKEN=${AWS_SESSION_TOKEN:-}
- AWS_DEFAULT_REGION=${AWS_DEFAULT_REGION:-}
- CLAUDE_CODE_USE_VERTEX=${CLAUDE_CODE_USE_VERTEX:-}
- CLOUD_ML_REGION=${CLOUD_ML_REGION:-}
- ANTHROPIC_VERTEX_PROJECT_ID=${ANTHROPIC_VERTEX_PROJECT_ID:-}
Expand Down
24 changes: 9 additions & 15 deletions package-lock.json

Some generated files are not rendered by default.

2 changes: 1 addition & 1 deletion package.json
Original file line number Diff line number Diff line change
Expand Up @@ -27,7 +27,7 @@
},
"devDependencies": {
"@types/js-yaml": "^4.0.9",
"@types/node": "^25.0.3",
"@types/node": "^25.3.3",
"typescript": "^5.9.3"
}
}
54 changes: 54 additions & 0 deletions scripts/bedrock-sso-login.sh
Original file line number Diff line number Diff line change
@@ -0,0 +1,54 @@
#!/usr/bin/env bash
# Helper script to log in with AWS SSO for a given profile and
# export short-lived credentials into the *current* shell.
#
# IMPORTANT: You must **source** this script so that the exported
# variables remain in your shell:
#
# source ./scripts/bedrock-sso-login.sh <profile-name>

# Ensure aws CLI is available
if ! command -v aws >/dev/null 2>&1; then
echo "[bedrock-sso-login] Error: 'aws' CLI not found in PATH." >&2
echo "Install AWS CLI v2 and try again." >&2
return 1 2>/dev/null || exit 1
fi

# Determine profile to use (argument, then $AWS_PROFILE, then "default")
if [[ -n "$1" ]]; then
  PROFILE="$1"
else
  PROFILE="${AWS_PROFILE:-default}"
fi

# Validate that the profile exists
if ! aws configure get sso_start_url --profile "$PROFILE" >/dev/null 2>&1; then
echo "[bedrock-sso-login] Error: AWS profile '$PROFILE' not found or not configured for SSO." >&2
echo "" >&2
echo "Available profiles:" >&2
aws configure list-profiles 2>/dev/null | sed 's/^/  /' >&2
return 1 2>/dev/null || exit 1
fi

# Perform SSO login (no-op if already logged in and cached)
echo "[bedrock-sso-login] Using AWS profile: $PROFILE"
echo "[bedrock-sso-login] Logging in with AWS SSO..."
aws sso login --profile "$PROFILE"

# Export short-lived credentials into the current shell.
# Requires AWS CLI v2.15+.
echo "[bedrock-sso-login] Exporting temporary credentials into current shell..."
_bedrock_creds="$(aws configure export-credentials --profile "$PROFILE" --format env 2>&1)" || {
echo "[bedrock-sso-login] Error: failed to export credentials." >&2
echo "$_bedrock_creds" >&2
echo "Make sure you have AWS CLI v2.15+ installed." >&2
unset _bedrock_creds
return 1 2>/dev/null || exit 1
}
eval "$_bedrock_creds"
unset _bedrock_creds

echo "[bedrock-sso-login] Temporary AWS credentials loaded for profile '$PROFILE'."
15 changes: 14 additions & 1 deletion shannon
Original file line number Diff line number Diff line change
Expand Up @@ -103,6 +103,13 @@ ensure_containers() {
docker compose -f "$COMPOSE_FILE" $COMPOSE_OVERRIDE up -d worker 2>/dev/null || true
fi

# When using Bedrock with SSO/IAM creds (short-lived), always recreate the worker
# so it picks up the latest AWS_ACCESS_KEY_ID / AWS_SESSION_TOKEN from the shell.
if [ "$CLAUDE_CODE_USE_BEDROCK" = "1" ] && [ -n "$AWS_ACCESS_KEY_ID" ]; then
echo "Bedrock SSO mode: refreshing worker to pick up current AWS credentials..."
docker compose -f "$COMPOSE_FILE" $COMPOSE_OVERRIDE up -d --force-recreate worker 2>/dev/null || true
fi

# Quick check: if Temporal is already healthy, we're good
if is_temporal_ready; then
return 0
Expand Down Expand Up @@ -148,14 +155,20 @@ cmd_start() {
# Bedrock mode — validate required AWS credentials
MISSING=""
[ -z "$AWS_REGION" ] && MISSING="$MISSING AWS_REGION"
[ -z "$AWS_BEARER_TOKEN_BEDROCK" ] && MISSING="$MISSING AWS_BEARER_TOKEN_BEDROCK"
[ -z "$ANTHROPIC_SMALL_MODEL" ] && MISSING="$MISSING ANTHROPIC_SMALL_MODEL"
[ -z "$ANTHROPIC_MEDIUM_MODEL" ] && MISSING="$MISSING ANTHROPIC_MEDIUM_MODEL"
[ -z "$ANTHROPIC_LARGE_MODEL" ] && MISSING="$MISSING ANTHROPIC_LARGE_MODEL"
if [ -n "$MISSING" ]; then
echo "ERROR: Bedrock mode requires the following env vars in .env:$MISSING"
exit 1
fi
# Require either a Bedrock API key or SSO/IAM credentials
if [ -z "$AWS_BEARER_TOKEN_BEDROCK" ] && [ -z "$AWS_ACCESS_KEY_ID" ]; then
echo "ERROR: Bedrock mode requires either AWS_BEARER_TOKEN_BEDROCK (API key)"
echo " or AWS_ACCESS_KEY_ID + AWS_SECRET_ACCESS_KEY + AWS_SESSION_TOKEN (SSO/IAM)"
echo " For SSO: source ./scripts/bedrock-sso-login.sh <profile-name>"
exit 1
fi
elif [ "$CLAUDE_CODE_USE_VERTEX" = "1" ]; then
# Vertex AI mode — validate required GCP credentials
MISSING=""
Expand Down
44 changes: 25 additions & 19 deletions src/ai/claude-executor.ts
Original file line number Diff line number Diff line change
Expand Up @@ -224,28 +224,34 @@ export async function runClaudePrompt(
const mcpServers = buildMcpServers(sourceDir, agentName, logger);

// 4. Build env vars to pass to SDK subprocesses
// Start with the full parent environment so the subprocess has PATH, HOME,
// and other vars the AWS SDK credential chain needs to function.
const sdkEnv: Record<string, string> = {
...Object.fromEntries(
Object.entries(process.env).filter((entry): entry is [string, string] => entry[1] != null)
),
CLAUDE_CODE_MAX_OUTPUT_TOKENS: process.env.CLAUDE_CODE_MAX_OUTPUT_TOKENS || '64000',
};
const passthroughVars = [
'ANTHROPIC_API_KEY',
'CLAUDE_CODE_OAUTH_TOKEN',
'ANTHROPIC_BASE_URL',
'ANTHROPIC_AUTH_TOKEN',
'CLAUDE_CODE_USE_BEDROCK',
'AWS_REGION',
'AWS_BEARER_TOKEN_BEDROCK',
'CLAUDE_CODE_USE_VERTEX',
'CLOUD_ML_REGION',
'ANTHROPIC_VERTEX_PROJECT_ID',
'GOOGLE_APPLICATION_CREDENTIALS',
'ANTHROPIC_SMALL_MODEL',
'ANTHROPIC_MEDIUM_MODEL',
'ANTHROPIC_LARGE_MODEL',
];
for (const name of passthroughVars) {
if (process.env[name]) {
sdkEnv[name] = process.env[name]!;

// When using Bedrock with IAM/SSO credentials, remove empty auth vars
// that would cause Claude Code to attempt the wrong auth method.
// - Empty ANTHROPIC_API_KEY → Claude Code tries API-key auth with blank key
// - Empty AWS_BEARER_TOKEN_BEDROCK → tries bearer-token auth with blank token
// - AWS_PROFILE → tries profile-based auth without ~/.aws/config
// - Empty CLAUDE_CODE_OAUTH_TOKEN → may interfere with provider selection
if (sdkEnv.CLAUDE_CODE_USE_BEDROCK === '1' && sdkEnv.AWS_ACCESS_KEY_ID) {
const authVarsToClean = [
'ANTHROPIC_API_KEY',
'ANTHROPIC_AUTH_TOKEN',
'ANTHROPIC_BASE_URL',
'AWS_BEARER_TOKEN_BEDROCK',
'AWS_PROFILE',
'CLAUDE_CODE_OAUTH_TOKEN',
'CLAUDE_CODE_USE_VERTEX',
'ANTHROPIC_VERTEX_PROJECT_ID',
];
for (const key of authVarsToClean) {
if (!sdkEnv[key]) delete sdkEnv[key];
}
}

Expand Down
18 changes: 16 additions & 2 deletions src/services/preflight.ts
Original file line number Diff line number Diff line change
Expand Up @@ -168,7 +168,7 @@ async function validateCredentials(

// 2. Bedrock mode — validate required AWS credentials are present
if (process.env.CLAUDE_CODE_USE_BEDROCK === '1') {
const required = ['AWS_REGION', 'AWS_BEARER_TOKEN_BEDROCK', 'ANTHROPIC_SMALL_MODEL', 'ANTHROPIC_MEDIUM_MODEL', 'ANTHROPIC_LARGE_MODEL'];
const required = ['AWS_REGION', 'ANTHROPIC_SMALL_MODEL', 'ANTHROPIC_MEDIUM_MODEL', 'ANTHROPIC_LARGE_MODEL'];
const missing = required.filter(v => !process.env[v]);
if (missing.length > 0) {
return err(
Expand All @@ -181,7 +181,21 @@ async function validateCredentials(
)
);
}
logger.info('Bedrock credentials OK');
// Require either a Bedrock API key or SSO/IAM credentials
const hasBearerToken = !!process.env.AWS_BEARER_TOKEN_BEDROCK;
const hasIamCreds = !!process.env.AWS_ACCESS_KEY_ID && !!process.env.AWS_SECRET_ACCESS_KEY;
if (!hasBearerToken && !hasIamCreds) {
return err(
new PentestError(
'Bedrock mode requires either AWS_BEARER_TOKEN_BEDROCK (API key) or AWS_ACCESS_KEY_ID + AWS_SECRET_ACCESS_KEY + AWS_SESSION_TOKEN (SSO/IAM). For SSO: source ./scripts/bedrock-sso-login.sh <profile-name>',
'config',
false,
{},
ErrorCode.AUTH_FAILED
)
);
}
logger.info(`Bedrock credentials OK (${hasBearerToken ? 'API key' : 'IAM/SSO'})`);
return ok(undefined);
}

Expand Down