
Help Center

Step-by-step guides for every ArcScan feature. FAQ →

Local / Dev Setup
Run ArcScan on your machine for development or testing
1
Clone the repo
git clone https://github.com/arcusautomate/AnsibleAuditor.git
cd AnsibleAuditor
2
Create a virtual environment
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt -c constraints.txt
3
Set environment variables
Create a .env file (or export in your shell):
export DJANGO_SECRET_KEY=$(python -c "import secrets; print(secrets.token_urlsafe(50))")
export ANTHROPIC_API_KEY=sk-ant-your-key-here  # or OPENAI_API_KEY
export DEBUG=True
4
Run migrations and start
python manage.py migrate
python manage.py createsuperuser
python manage.py runserver
Open http://localhost:8000 in your browser.
Tip: For background tasks (scheduled scans, async reports), also run Celery: celery -A playbookscan worker --loglevel=info
Your First Scan
Run an Ansible playbook audit in under 60 seconds
1
Open the Scan page
Click Security → Scan Playbook in the top navigation, or press ⚡ New Scan from the dashboard.
2
Paste or upload your playbook
Paste YAML directly into the editor, or click Upload File to select a .yml or .yaml file. You can also drag and drop.
3
Click Scan
ArcScan sends the content to your configured AI provider. The audit typically completes in 5–20 seconds.
4
Review your report
Your score (0–100) and grade (A–F) appear at the top. Scroll through findings sorted by severity — Critical → Warning → Info → Pass.
No API key yet? Go to Settings and add your Anthropic or OpenAI key first. See Add an AI API key.
Add an AI API Key
Connect Anthropic, OpenAI, Gemini, DeepSeek, or Nvidia
1
Open Settings
2
Choose your provider
Select from Anthropic · OpenAI · Gemini · DeepSeek · Nvidia. Anthropic Claude is recommended for best audit quality.
3
Paste your API key
Get your key from the provider's dashboard (e.g. console.anthropic.com → API Keys). Paste it into the API Key field and click Save.
4
Test it
Run a quick scan — if the key is valid you'll see a report within seconds. Keys are encrypted at rest and never shared.
Custom endpoint? Enterprise users can enter a custom base URL (e.g. Azure OpenAI, local Ollama) in the Custom Endpoint field below the key.
Account & Profile
Email preferences, report delivery, and weekly digest
2
Set your report email
Enter the email address to receive scan reports and alerts. Defaults to your account email if left blank.
3
Enable weekly digest
Toggle Weekly security digest to receive a Monday morning summary: scan count, posture score, open critical findings. Sent only if you had scans in the past 30 days.
4
Theme
Click any colour swatch in Settings → Theme (or via the top-right Settings menu) to switch colour scheme. Saved to your browser.
Enable Two-Factor Auth
TOTP app, SMS, or email verification
2
Choose a method
Authenticator App (recommended — Google Authenticator, Authy, 1Password) · SMS · Email OTP
3
Scan QR code (TOTP)
Open your authenticator app, scan the QR code shown, then enter the 6-digit code to confirm. Save your backup codes in a secure location.
4
Confirm & activate
Enter the code and click Verify. 2FA is now active — you'll be challenged on every new login.
Lost your device? Use a backup code to sign in, then reconfigure 2FA. Contact your org admin if backup codes are also lost.
Scan a Playbook
Full walkthrough of the playbook audit tab
1
Paste, upload, or pull from storage
The Audit tab accepts: raw YAML paste · file upload · cloud storage files (Drive, GitHub, GitLab). Switch tabs to choose your input method.
2
Select a scan type
Choose Ansible Playbook, Kubernetes, Dockerfile, GitHub Actions, etc. — each uses a purpose-built AI prompt.
3
Click Scan & watch the stream
Results stream in real time. The AI checks: hardcoded secrets · missing no_log · non-idempotent shell/command · deprecated modules · state:latest · broad become · missing tags · Jinja2 issues · and 10+ more.
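Checks like missing no_log can be sketched deterministically. The following is an illustrative Python heuristic only, not ArcScan's implementation (the real audit is AI-driven); the regex and function name are assumptions:

```python
import re

# Illustrative: flag secret-handling lines in a play that never sets no_log: true.
SECRET_HINT = re.compile(r"password|secret|token|api_key", re.IGNORECASE)

def find_unlogged_secrets(playbook_yaml: str) -> list[int]:
    """Return 1-based line numbers of secret-looking lines when the
    play declares no 'no_log: true' anywhere."""
    lines = playbook_yaml.splitlines()
    if any("no_log: true" in ln for ln in lines):
        return []
    return [i + 1 for i, ln in enumerate(lines) if SECRET_HINT.search(ln)]

playbook = """\
- hosts: db
  tasks:
    - name: Create service account
      ansible.builtin.user:
        name: svc
        password: "{{ db_password }}"
"""
print(find_unlogged_secrets(playbook))  # → [6]
```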
4
Review the score card
Score 0–100 · Grade A–F · Critical/Warning/Info/Pass counts · Executive summary · Per-finding detail with line hints and fix snippets.
Finding IDs follow the format ANSI-001 and map to ansible-lint rule IDs where applicable. Click a finding to expand the full fix snippet.
Reading a Report
Score, grades, findings, compliance, threat model, and SBOM
Score (0–100) — Weighted deduction per severity. 100 = no issues found.
Grade (A–F) — A≥90 · B≥75 · C≥60 · D≥40 · F<40
Critical — Security risks that must be fixed before production.
Warning — Best-practice gaps that can cause reliability issues.
Info — Non-blocking suggestions for improvement.
Pass — Checks your playbook passed — confirms coverage.
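The score-to-grade mapping follows directly from the documented thresholds. A minimal Python sketch:

```python
def grade(score: int) -> str:
    """Map a 0-100 score to the documented letter grade
    (A>=90, B>=75, C>=60, D>=40, F<40)."""
    for letter, floor in (("A", 90), ("B", 75), ("C", 60), ("D", 40)):
        if score >= floor:
            return letter
    return "F"

print(grade(92))  # → A
```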

Compliance tab — per-framework scoring (CIS, NIST, SOC 2, FedRAMP) with control-level pass/fail breakdowns. Available on Pro+.
Threat Model tab — AI-generated attack vectors with MITRE ATT&CK references and likelihood/impact ratings.
SBOM — Software bill of materials listing Ansible collections, roles, and packages. Download as CycloneDX JSON.
Dep Graph — Visual map of play → role → task dependencies. Click Expand to render.
Blast Radius — Estimated host count, target groups, and environment impact of the playbook.

Auto-Fix Findings
Let Claude patch your playbook and open a remediation PR
1
Open a report with findings
Navigate to any report with Critical or Warning findings.
2
Click ⚡ Auto-Fix
ArcScan sends all findings + the original YAML to Claude with a remediation prompt. The AI returns a fully patched playbook with # FIXED: comments on each change.
3
Review the diff
The diff view highlights every added/removed line in green/red. Scroll through before accepting.
4
Download or open a PR
Click Download Fixed to get the patched YAML, or click Open PR (requires GitHub/GitLab token in Settings) to automatically push a remediation branch and open a pull request.
Pro tip: Run a second scan on the fixed file to verify your score improved. Use ⇄ Compare to view the diff between the original and fixed reports.
Multi-File / ZIP Scan
Audit an entire Ansible project or role directory
1
Zip your project
Create a .zip of your Ansible directory. Recommended structure: roles/, playbooks/, inventory/. Max 10MB.
2
Switch to Multi-file tab
On the Scan page, click the Multi-file / Zip Scan tab and upload your archive.
3
Review results per file
Each YAML file is scanned individually. The results page shows a consolidated score plus per-file breakdowns with individual finding lists.
Schedule Recurring Scans
Automatically re-scan playbooks daily or weekly
1
Open Scheduled Scans
2
Set frequency and source
Choose Daily or Weekly. Attach a stored playbook from the library, or paste inline YAML.
3
Configure notifications
Enable email notification on new critical findings. Optionally attach a webhook (Slack, Teams) to post results automatically.
4
Regression detection
If the score drops more than 10 points between runs, ArcScan flags a regression and sends an alert regardless of your notification settings.
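The regression rule amounts to a one-line comparison. A Python sketch, assuming "drops more than 10 points" means a strict drop of 11 or more:

```python
REGRESSION_THRESHOLD = 10  # points, as documented

def is_regression(previous_score: int, current_score: int) -> bool:
    """Flag a regression when the score drops more than 10 points
    between scheduled runs."""
    return previous_score - current_score > REGRESSION_THRESHOLD

print(is_regression(85, 70))  # → True (15-point drop)
```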
Compare Two Reports
See which findings were fixed, introduced, or persisted
1
From a report
Open any report and click the ⇄ Compare button. The compare page loads with that report pre-selected as Report A.
2
From the Compare page
Go to Insights → All Reports → Compare, then select two reports from the dropdowns.
3
Read the delta
Fixed — findings from Report A that were resolved. New — findings that first appear in Report B. Persisted — findings present in both reports. Score delta shows improvement or regression.
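The delta classification is plain set arithmetic over finding IDs (IDs like ANSI-001 follow the format described under Scan a Playbook). An illustrative Python sketch:

```python
def report_delta(report_a: set[str], report_b: set[str]) -> dict[str, set[str]]:
    """Classify finding IDs between two reports."""
    return {
        "fixed": report_a - report_b,      # resolved since Report A
        "new": report_b - report_a,        # introduced in Report B
        "persisted": report_a & report_b,  # present in both
    }

delta = report_delta({"ANSI-001", "ANSI-004"}, {"ANSI-004", "ANSI-007"})
print(delta["fixed"])  # → {'ANSI-001'}
```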
Connect a Cloud Account
AWS, Azure, and GCP credential setup
1
Open Cloud Connect
2
Select your provider
AWS — Enter Access Key ID and Secret Key. Minimum IAM permissions: ReadOnlyAccess or a custom policy with EC2/S3/RDS/IAM Describe permissions.
Azure — Service principal credentials: Tenant ID, Client ID, Client Secret, Subscription ID. Assign Reader role.
GCP — Service account JSON key. Assign Viewer role at the project level.
3
Give it a label and save
Credentials are encrypted with Fernet symmetric encryption. They never appear in logs or AI prompts.
4
Run your first inventory pull
Go to Cloud → Inventory and click Refresh to pull live resource data from your account.
Least privilege: ArcScan only needs read access. Never grant write or admin permissions to the scan credential.
Security Posture Scan
CIS-aligned deterministic checks with AI remediation
1
Open Security Posture
2
Select credential and scan
Choose your cloud account from the dropdown and click Run Scan. Deterministic checks complete in seconds — no AI call, no cost.
3
Review findings
Each finding shows: Severity · CIS control reference · Resource affected · Why it's a risk · Remediation steps · Suggested Ansible module.
4
Generate remediation playbook (optional)
Click Generate Remediation to have Claude write an Ansible playbook targeting all critical findings. Requires AI API key.
Cost Anomaly Detection
Spot unexpected spend spikes in your cloud accounts
1
Connect a cloud account
Cost data requires a connected account. See Connect a Cloud Account.
2
Review flagged services
ArcScan compares current spend against your 30-day rolling baseline. Services spending 2× or more above baseline are flagged with the delta amount.
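The 2× baseline rule can be sketched as below. Field names and rounding are assumptions for illustration, not ArcScan's actual schema:

```python
def flag_anomalies(baseline: dict[str, float],
                   current: dict[str, float]) -> dict[str, float]:
    """Flag services spending 2x or more above their 30-day rolling
    baseline; each value is the delta amount shown in the UI."""
    return {
        svc: round(current[svc] - baseline[svc], 2)
        for svc in current
        if svc in baseline
        and baseline[svc] > 0
        and current[svc] >= 2 * baseline[svc]
    }

baseline = {"EC2": 100.0, "S3": 10.0}
current = {"EC2": 250.0, "S3": 15.0, "RDS": 40.0}  # RDS has no baseline yet
print(flag_anomalies(baseline, current))  # → {'EC2': 150.0}
```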
Drift Detection
Detect when live infrastructure deviates from your baseline
1
Create a baseline
Run an inventory pull, then click Lock as Baseline on the Cloud Inventory page. This snapshots the current resource state.
2
Schedule drift checks
Go to Security → Drift Alerts and enable daily drift comparison. ArcScan compares the live inventory against your locked baseline.
3
Receive drift alerts
New, removed, or changed resources trigger an email + webhook alert with a diff of what changed.
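Conceptually, drift is a three-way diff between the locked baseline and the live inventory. A Python sketch keyed by resource ID (the keying scheme is an assumption):

```python
def drift(baseline: dict[str, dict], live: dict[str, dict]) -> dict[str, list[str]]:
    """Diff a live inventory against a locked baseline snapshot."""
    return {
        "new": sorted(live.keys() - baseline.keys()),
        "removed": sorted(baseline.keys() - live.keys()),
        "changed": sorted(
            rid for rid in baseline.keys() & live.keys()
            if baseline[rid] != live[rid]
        ),
    }

baseline = {"i-1": {"type": "t3.micro"}, "i-2": {"type": "t3.small"}}
live = {"i-1": {"type": "t3.large"}, "i-3": {"type": "t3.micro"}}
print(drift(baseline, live))  # → {'new': ['i-3'], 'removed': ['i-2'], 'changed': ['i-1']}
```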
Application Tagging
Map cloud resources to business applications and generate Terraform tags
1
Open a cloud snapshot
Go to Cloud → Inventory and click into any snapshot.
2
Select resources
Use the checkboxes next to EC2 instances, S3 buckets, RDS databases, security groups, etc. The selection bar shows how many are selected.
3
Tag to an application
Click Tag Application. Select an existing application or create a new one (name, owner, environment, criticality, data classification). Click Apply Tags.
4
Generate Terraform (optional)
Click Generate Terraform Tags to produce HCL that applies arcscan:* tags to the selected resources in your real cloud infrastructure. Download the .tf file and run terraform init && terraform apply.
Persistence: Application mappings persist across scans. Once tagged, the association holds until you remove it — even if you re-scan the cloud account.
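The generated HCL applies arcscan:* tags to each selected resource. A hypothetical Python sketch of what such a tags block might look like — the exact key names are assumptions, so check the downloaded .tf file for the real schema:

```python
def tag_block(app: str, environment: str, criticality: str) -> str:
    """Render an HCL tags map with arcscan:* keys.
    Key names are illustrative, not ArcScan's actual schema."""
    tags = {
        "arcscan:application": app,
        "arcscan:environment": environment,
        "arcscan:criticality": criticality,
    }
    inner = "\n".join(f'    "{k}" = "{v}"' for k, v in tags.items())
    return "  tags = {\n" + inner + "\n  }"

print(tag_block("billing", "prod", "high"))
```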
ITSM Application-Resource Sync
Push app-resource mappings as dependencies to ServiceNow, Jira, and FreshService
When you tag resources to an application, you can optionally push that mapping as dependencies/relationships to your configured ITSM tools.
1
Configure your ITSM integration
Set up at least one integration: ServiceNow (with CMDB push enabled), Jira, or FreshService.
2
Tag resources to an application
In a cloud snapshot, select resources and click Tag Application. When integrations are configured, a Sync to ITSM section appears with checkboxes.
3
Check your targets and apply
Check ServiceNow CMDB, Jira, and/or FreshService, then click Apply Tags. ArcScan tags locally and syncs to the selected tools.
What gets created per tool:
ServiceNow
Application becomes a Business Service CI (cmdb_ci_service). Each resource CI is linked via cmdb_rel_ci "Depends on" relationships.
Jira
Application becomes an Epic. Each tagged resource becomes a Sub-task linked to the Epic.
FreshService
Application becomes an Asset. Each resource becomes a linked asset via the FreshService relationships API.
Idempotent: Re-syncing the same application reuses the existing ITSM objects (same Epic, same Business Service CI) rather than creating duplicates. New resources are added as new relationships.
Generate a Playbook
Describe what you need — Claude writes the YAML
2
Describe your task
Write a plain-English description. Examples: "Install and configure Nginx on RHEL 9 with TLS" or "Create a user, set a password with Ansible Vault, and add to sudoers".
3
Configure options
Set target OS, inventory hosts group, whether to use become, and style (minimal/verbose/best-practice). Best-practice mode adds handlers, tags, and no_log automatically.
4
Generate → scan immediately
Click Scan Generated to immediately audit the output. Generated playbooks typically score A or B — any gaps are shown as findings.
Generate Terraform
Three ways to produce production-ready HCL
ArcScan offers three Terraform generation modes:
A
From Cloud Inventory
Pull your live AWS/Azure/GCP inventory, then click Generate Terraform. ArcScan reverse-engineers your existing infrastructure into HCL modules with proper variable files and outputs.
B
From Description
Describe the infrastructure you want: "3-tier VPC with public/private subnets, ALB, 2 EC2 web servers, RDS MySQL". Claude generates complete, provider-versioned HCL.
C
Import tfstate
Upload a terraform.tfstate file. ArcScan reconstructs the source HCL and audits it for security issues simultaneously.
Access all three modes at Generate → Terraform Dashboard.
CI/CD Workflow Builder
Generate pipeline configs with ArcScan quality gates
1
Open Workflow Builder
2
Choose your CI platform
GitHub Actions · GitLab CI · Jenkins · Azure DevOps · AWX/AAP
3
Set quality gate threshold
Enter a minimum score (e.g. 75). The generated pipeline will call the ArcScan API, parse the score, and fail the build if it falls below the threshold.
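The gate logic the generated pipeline performs amounts to: read the report's score and exit non-zero below the threshold. A minimal Python sketch — the top-level "score" field is an assumption; see Generate → API Docs for the real response schema:

```python
import json

def gate(report_json: str, min_score: int) -> int:
    """Return a CI exit code: 0 when the score meets the gate, 1 otherwise.
    The "score" field name is an assumption about the API response."""
    score = json.loads(report_json)["score"]
    print(f"ArcScan score: {score} (gate: {min_score})")
    return 0 if score >= min_score else 1

exit_code = gate('{"score": 82}', 75)  # → 0 (build passes)
```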
4
Copy and commit
Copy the generated YAML/Jenkinsfile to your repo. Add ARCSCAN_API_KEY as a CI secret using your token from Settings → API Keys.
PR Auto-Scan
Automatically audit playbooks when a PR is opened on GitHub or GitLab
1
Add your SCM token
Go to Settings → Settings & API Keys. Paste your GitHub Personal Access Token (needs repo scope) or GitLab token (needs api scope).
2
Set a webhook secret (recommended)
Add GITHUB_WEBHOOK_SECRET (or GITLAB_WEBHOOK_SECRET) to your server environment. This is used to verify that webhook payloads come from GitHub/GitLab.
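GitHub signs each delivery with HMAC-SHA256 and sends the digest in the X-Hub-Signature-256 header (GitLab instead echoes the secret verbatim in X-Gitlab-Token). A sketch of the standard GitHub-style verification — this is GitHub's documented scheme, not ArcScan-specific code:

```python
import hashlib
import hmac

def verify_github_signature(secret: str, payload: bytes,
                            signature_header: str) -> bool:
    """Verify GitHub's X-Hub-Signature-256 header against the raw
    request body using a constant-time comparison."""
    expected = "sha256=" + hmac.new(
        secret.encode(), payload, hashlib.sha256
    ).hexdigest()
    return hmac.compare_digest(expected, signature_header)
```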
3
Configure the webhook in GitHub
In your GitHub repo: Settings → Webhooks → Add webhook.
Payload URL: https://your-domain/webhooks/github/
Content type: application/json
Secret: your GITHUB_WEBHOOK_SECRET
Events: check Pull requests only.
4
Open a PR
When a PR adds or modifies .yml / .yaml files, ArcScan audits each one (up to 5) and posts a score table comment directly on the PR.
GitLab: Same flow but use /webhooks/gitlab/ as the payload URL and select Merge request events.
ServiceNow Integration
Auto-create incidents, push CIs to CMDB, and sync app dependencies
1
Configure connection
Integrations → ServiceNow. Enter your instance URL (e.g. company.service-now.com), username, and password or OAuth token.
2
Set assignment group and urgency
Map ArcScan severity levels to ServiceNow urgency (1–3). Choose the default assignment group for auto-created incidents.
3
Auto-create incidents from findings
When a scan produces Critical findings, ArcScan can automatically open a ServiceNow incident with the finding detail, remediation steps, and a link to the ArcScan report.
4
CMDB CI push
Enable Push to CMDB in your ServiceNow config. ArcScan creates or updates Configuration Items for each cloud resource using a configurable CMDB table and correlation field.
5
Application dependency sync
When tagging resources to a business application, check ServiceNow CMDB in the Sync to ITSM section. ArcScan creates a Business Service CI for the app and cmdb_rel_ci relationships linking each resource. See ITSM Application-Resource Sync.
Slack & Teams Alerts
Real-time security event notifications
1
Create an Incoming Webhook
Slack: api.slack.com/apps → Your App → Incoming Webhooks → Add New Webhook. Copy the Webhook URL.
Teams: In your Teams channel → Connectors → Incoming Webhook → Configure. Copy the URL.
2
Add in ArcScan
Integrations → SIEM Webhooks → New Webhook. Set provider to Slack or Microsoft Teams, paste the webhook URL, and choose which events to send.
3
Test it
Click Test next to the webhook. A test message appears in your Slack/Teams channel within seconds.
Available events: new scan completed · critical finding detected · account locked · scheduled scan regression · cloud drift alert.
REST API
Programmatic access to scans, reports, and findings
1
Generate an API token
Settings → API Keys → Create Token. Copy the token — it's only shown once.
2
Authenticate
Pass your token in every request header: X-API-Key: arcscan_<your_token>
3
Submit a scan
curl -X POST https://your-domain/api/v1/scan/ \
  -H "X-API-Key: arcscan_YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"name": "deploy.yml", "content": "---\n- hosts: all\n tasks:\n ..."}'
4
Retrieve reports
curl https://your-domain/api/v1/reports/ \
  -H "X-API-Key: arcscan_YOUR_TOKEN"
Full API reference at Generate → API Docs.
CVE Watch
Get email alerts when new CVEs match your packages or roles
1
Open CVE Watch
Settings → CVE Watch (or search for it in Smart Search).
2
Add packages to watch
Enter the package/role name and select the ecosystem: Ansible Galaxy · PyPI · npm · Generic keyword. Set your severity threshold (High+ recommended).
3
Receive alerts
ArcScan checks the NVD CVE 2.0 API weekly. When a new CVE is published matching your package at or above your threshold, you'll receive an email with the CVE ID, CVSS score, description, and NVD link.
Speed up checks: Admins can add an NVD_API_KEY environment variable for higher NVD API rate limits.
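Severity thresholds map onto CVSS v3 score bands. A Python sketch of the match rule — the band floors follow the published CVSS v3 rating scale, but the function itself is illustrative:

```python
# CVSS v3 rating-scale floors: Low 0.1-3.9, Medium 4.0-6.9,
# High 7.0-8.9, Critical 9.0-10.0.
SEVERITY_FLOOR = {"LOW": 0.1, "MEDIUM": 4.0, "HIGH": 7.0, "CRITICAL": 9.0}

def matches_watch(cve_cvss: float, threshold: str = "HIGH") -> bool:
    """True when a CVE's CVSS score meets the configured severity
    threshold (illustrative, not ArcScan's actual matcher)."""
    return cve_cvss >= SEVERITY_FLOOR[threshold.upper()]

print(matches_watch(7.5))  # → True (High+ threshold catches CVSS 7.5)
```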
Credential Store
Store secrets that auto-inject into generated playbooks
2
Add a secret
Click New Secret. Enter a name (used as a Jinja2 variable, e.g. db_password), a value, and an optional description. Secrets are encrypted with Fernet and never appear in logs.
3
Auto-injection
When generating playbooks or running audits, ArcScan substitutes vault references with the actual encrypted Ansible Vault values in the output. The raw secret never touches the AI provider.
4
Reveal or rotate
Click the eye icon to temporarily reveal a secret for verification. Click Edit to rotate the value — the audit log records the change timestamp.
External Vaults
Connect HashiCorp Vault, AWS Secrets Manager, or Azure Key Vault
2
Choose your vault type
HashiCorp Vault — KV v2, Token or AppRole auth.
AWS Secrets Manager — IAM role or access keys.
Azure Key Vault — Service principal + vault URL.
3
Test and save
Click Test Connection. ArcScan does a read-only lookup to verify credentials. Connection details are stored encrypted.
4
Reference in playbooks
Use the external vault path syntax when generating playbooks. ArcScan resolves paths at generation time and wraps them in !vault tags for Ansible Vault compatibility.
Create an Organisation
Share reports, credentials, and scan history with your team
1
Go to Team Workspaces
Settings → Team Workspaces → Create Organisation.
2
Set name and invite members
Enter a display name and optional branding. Invite members by email — they receive an invitation link and are added with Member role by default.
3
Share an org-level API key
Create a shared API key for the org so all members can run scans under the same account without sharing individual keys.
4
View org reports
On the Dashboard, the Team Reports tab shows all scans by org members. Owners and admins can see all reports; members see their own.
Roles & Permissions
Owner → Admin → Member → Viewer hierarchy
OwnerFull access. Can delete org, promote/demote members, manage billing and SSO.
AdminSame as Owner minus billing and org deletion. Can invite/remove members.
MemberCan scan, generate, view all org reports, manage their own vault secrets.
ViewerRead-only access to org reports and posture dashboards. Cannot scan or generate.
Change a member's role in Settings → Team Workspaces by clicking the role badge next to their name. Role changes take effect immediately.
AWX / AAP Setup
Browse and scan playbooks from Ansible Automation Platform
1
Get your AWX credentials
In AWX/AAP, go to Users → your user → Tokens and create a Personal Access Token with Read scope. Copy the token. Alternatively, use a username + password.
2
Configure in ArcScan
Go to Integrations → AWX / AAP Settings. Enter:
  • Instance URL — e.g. https://awx.company.com
  • Token — the PAT you created (or username + password)
  • Verify SSL — uncheck only for self-signed certs
3
Test connection
Click Test Connection. You should see the AWX version. Then click Sync Job Templates to pull in your templates.
4
Browse playbooks
Go to Integrations → AAP Playbooks to browse all projects and playbooks. Click any playbook to preview it, then Scan to run an audit.
Score gating: Set Min Score to Run (e.g. 60) to block AWX job templates from running if their playbook scores below the threshold.
GitHub Integration
Browse repos, auto-scan PRs, create remediation PRs
1
Connect via OAuth
Go to Integrations → Cloud Storage and click Connect GitHub. Authorize ArcScan to access your repositories.
2
Browse and scan
On the Scan page, click the GitHub button in the import row. Browse repos → navigate folders → select a YAML file → Import. Content loads into the editor for scanning.
3
PR auto-scan (optional)
Set up a webhook to auto-scan playbooks on every pull request:
Repo → Settings → Webhooks → Add webhook
Payload URL: https://your-arcscan.com/webhooks/github/
Content type: application/json
Secret: (set GITHUB_WEBHOOK_SECRET in .env)
Events: Pull requests
4
Remediation PRs
After scanning, click Create Remediation PR on any report. ArcScan auto-fixes findings and opens a PR against your repo.
GitLab & Bitbucket work the same way — connect via Cloud Storage, browse repos, set up webhooks at /webhooks/gitlab/.
Azure DevOps
Browse repos, scan playbooks, auto-create work items from findings
1
Create a Personal Access Token
In Azure DevOps: User Settings → Personal Access Tokens → New Token. Grant these scopes:
  • Code — Read (to browse repos and files)
  • Work Items — Read & Write (for auto-creating bugs)
2
Configure in ArcScan
Go to Integrations → Azure DevOps. Enter:
  • Organization URL — https://dev.azure.com/your-org
  • PAT Token — paste the token (encrypted at rest)
  • Project — the project name to browse
3
Browse and scan
On the Scan page, click Azure DevOps → select a repo → navigate to your playbook → Import.
Auto work items: Enable Auto-create work items to automatically file Bugs in Azure DevOps for critical/high findings.
Wiz Integration
Import Wiz issues and auto-generate remediation playbooks
1
Get Wiz API credentials
In Wiz: Settings → Service Accounts → Create. Grant read:issues and read:resources scopes. Copy the Client ID and Client Secret.
2
Configure in ArcScan
Go to Integrations → Wiz → Settings. Enter your Client ID, Client Secret, and the Wiz API URL (e.g. https://api.us1.app.wiz.io).
3
Run a scan
Go to Integrations → Wiz and click Scan. ArcScan pulls your Wiz issues, groups them, and generates remediation playbooks using AI.
4
Auto-remediate
Review the generated playbooks, download them, or push directly as a PR to your SCM. Schedule recurring scans for continuous remediation.
Terraform Cloud
Link Terraform Cloud for IaC governance and deployment gating
1
Create a TFC API token
In Terraform Cloud: User Settings → Tokens → Create an API token. Copy the token.
2
Configure in ArcScan
Go to Integrations → Terraform Cloud. Enter your Organization name and API Token.
3
Scan Terraform configs
Go to Generate → Terraform to scan HCL files, or import Terraform state for drift detection.
Cloud Storage
Connect Google Drive, OneDrive, Dropbox, GitHub, GitLab, or Bitbucket
1
Go to Cloud Storage
Integrations → Cloud Storage shows all available providers with connect buttons.
2
Connect a provider
Click Connect next to the provider. You'll be redirected to their OAuth screen to authorize ArcScan. For Git providers (GitHub, GitLab, Bitbucket), grant read repository scope.
3
Browse and scan files
Once connected, the provider appears on the Scan page import row. Click it to browse your repos/files and import playbooks directly into the scanner.
Auto-sync: Set up folder sync to automatically scan playbooks when they change in a specific folder or repo path.
Self-hosted: For Docker deployments, you need OAuth client credentials. Register an OAuth app with each provider and set the client ID/secret in your .env: GITHUB_STORAGE_CLIENT_ID, GITHUB_STORAGE_CLIENT_SECRET, etc.
Jira Integration
Auto-create issues from findings and sync application Epics
1
Create an API token
Go to Atlassian API tokens and create a new token. Copy it.
2
Configure in ArcScan
Go to Integrations → Jira. Enter:
  • Jira URL — https://your-org.atlassian.net
  • Email — your Atlassian email
  • API Token — paste the token
  • Project Key — e.g. SEC or OPS
3
Test and enable
Click Test Connection, then enable Auto-create issues to file tickets for critical/high findings automatically.
4
Application Epic sync
When tagging cloud resources to an application, check Jira in the Sync to ITSM section. ArcScan creates an Epic for the application with Sub-tasks for each tagged resource. See ITSM Application-Resource Sync.
FreshService Integration
Auto-create tickets/changes and sync application assets
1
Get your API key
In FreshService: Profile → API Key. Copy the key.
2
Configure in ArcScan
Go to Integrations → FreshService. Enter your domain (e.g. mycompany.freshservice.com) and API Key. Optionally set agent, group, and department.
3
Auto-create tickets or changes
Choose Ticket or Change as the record type. ArcScan automatically creates records when scans complete with findings above your score threshold.
4
Application asset sync
When tagging cloud resources to an application, check FreshService in the Sync to ITSM section. ArcScan creates an Asset for the application and linked assets for each resource. See ITSM Application-Resource Sync.
PagerDuty
Trigger incidents for critical scan findings
1
Get a routing key
In PagerDuty: Services → your service → Integrations → Add Integration → Events API v2. Copy the Integration Key (routing key).
2
Configure in ArcScan
Go to Integrations → PagerDuty. Enter your Routing Key and Service ID.
3
Set trigger thresholds
Enable Auto-trigger on Critical and/or Auto-trigger on High severity findings. PagerDuty incidents are created automatically when findings match.
Datadog
Push scan metrics and events to Datadog dashboards
1
Get API and App keys
In Datadog: Organization Settings → API Keys (copy API key) and Application Keys (create and copy an app key).
2
Configure in ArcScan
Go to Integrations → Datadog. Enter your API Key, App Key, and site (e.g. datadoghq.com or datadoghq.eu).
3
Enable push options
Toggle Push Metrics (scan scores, finding counts) and Push Events (scan completed, critical findings) to populate your Datadog dashboards.
OPA Security Rules
Evaluate playbooks against custom Rego policies
1
Deploy OPA
Run an OPA server (Docker: docker run -d -p 8181:8181 openpolicyagent/opa run --server) or use an existing instance.
2
Configure in ArcScan
Go to Integrations → OPA. Enter your Server URL (e.g. http://opa:8181) and optional auth token.
3
Upload policies
Upload Rego policies via the OPA settings page, or push them directly to OPA's API. Set a default package (e.g. arcscan.ansible) for automatic evaluation.
Snyk
Scan dependencies for known vulnerabilities
1
Get a Snyk API token
In Snyk: Account Settings → General → Auth Token. Copy the token.
2
Configure in ArcScan
Go to Integrations → Snyk. Enter your API Token and Organization ID (found in Snyk org settings).
3
Enable auto-scan
Toggle Auto-scan to run Snyk vulnerability checks alongside every ArcScan playbook audit.
SSO Setup
Google, Microsoft, GitHub, GitLab, or SAML 2.0
1
Open SSO Settings
Settings → SSO (Enterprise plan required).
2
Choose provider
Google · Microsoft · GitHub · GitLab · SAML 2.0. For OAuth providers, register ArcScan as an app in the provider's developer console and enter Client ID + Secret.
3
Set allowed domains
Enter a comma-separated list of email domains (e.g. company.com, subsidiary.com). Only users with those email domains can sign in via SSO.
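The domain check is straightforward. A Python sketch of how a comma-separated allowlist might be evaluated — illustrative only, not ArcScan's actual code:

```python
def allowed(email: str, domains_csv: str) -> bool:
    """Check an email address against a comma-separated list of
    allowed domains, ignoring whitespace and case."""
    allowed_domains = {
        d.strip().lower() for d in domains_csv.split(",") if d.strip()
    }
    return email.rsplit("@", 1)[-1].lower() in allowed_domains

print(allowed("ana@company.com", "company.com, subsidiary.com"))  # → True
```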
4
Test and enforce
Click Test SSO to open a pop-up login flow. Once verified, optionally enable Enforce SSO to prevent password logins for org members.