Prowler API patterns: RLS, RBAC, providers, Celery tasks.
Trigger: When working in api/ on models/serializers/viewsets/filters/tasks involving tenant isolation (RLS), RBAC, or provider lifecycle.
For generic DRF patterns (ViewSets, Serializers, Filters, JSON:API), use the django-drf skill.
Critical Rules
- ALWAYS use rls_transaction(tenant_id) when querying outside a ViewSet context
- ALWAYS use get_role() before checking permissions (it returns the FIRST role only)
- ALWAYS apply @set_tenant before @handle_provider_deletion (decorator order matters)
- ALWAYS use explicit through models for M2M relationships (required for RLS)
- NEVER access Provider.objects without RLS context in Celery tasks
- NEVER bypass RLS with raw SQL or connection.cursor()
- NEVER use Django's default M2M; RLS requires through models with tenant_id
Note: rls_transaction() accepts both UUID objects and strings - it converts internally via str(value).
Architecture Overview
4-Database Architecture
| Database | Alias | Purpose | RLS |
|----------|-------|---------|-----|
| default | prowler_user | Standard API queries | Yes |
| admin | admin | Migrations, auth bypass | No |
| replica | prowler_user | Read-only queries | Yes |
| admin_replica | admin | Admin read replica | No |
```python
# When to use admin (bypasses RLS)
from api.db_router import MainRouter

User.objects.using(MainRouter.admin_db).get(id=user_id)  # Auth lookups

# Standard queries use default (RLS enforced)
Provider.objects.filter(connected=True)  # Requires rls_transaction context
```
RLS Transaction Flow
```
Request → Authentication → BaseRLSViewSet.initial()
    │
    ├─ Extract tenant_id from JWT
    ├─ SET api.tenant_id = 'uuid' (PostgreSQL)
    └─ All queries now tenant-scoped
```
- Normal queries → Model.objects (excludes deleted)
- Include deleted records → Model.all_objects
- Celery task context → must use rls_transaction() first (see the sketch below)
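A minimal sketch of Celery-side access under RLS; the import paths and task name are assumptions (adjust them to the actual modules), while the rls_transaction() usage follows the rules above:

```python
from celery import shared_task

from api.db_utils import rls_transaction  # assumed import path
from api.models import Provider           # assumed import path


@shared_task
def count_connected_providers(tenant_id: str) -> int:
    # Outside BaseRLSViewSet there is no tenant context, so every ORM call
    # must run inside rls_transaction(); it accepts a UUID or its string form.
    with rls_transaction(tenant_id):
        return Provider.objects.filter(connected=True).count()
```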
Which Database?
- Standard API queries → default (automatic via ViewSet)
- Read-only operations → replica (automatic for GET in BaseRLSViewSet)
- Auth/admin operations → MainRouter.admin_db
- Cross-tenant lookups → MainRouter.admin_db (use sparingly!)
Model Definition
```python
from uuid import uuid4

from django.db import models

from api.rls import RowLevelSecurityProtectedModel, RowLevelSecurityConstraint


class MyModel(RowLevelSecurityProtectedModel):
    # tenant FK inherited from RowLevelSecurityProtectedModel
    id = models.UUIDField(primary_key=True, default=uuid4, editable=False)
    name = models.CharField(max_length=255)
    inserted_at = models.DateTimeField(auto_now_add=True, editable=False)
    updated_at = models.DateTimeField(auto_now=True, editable=False)

    class Meta(RowLevelSecurityProtectedModel.Meta):
        db_table = "my_models"
        constraints = [
            RowLevelSecurityConstraint(
                field="tenant_id",
                name="rls_on_%(class)s",
                statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
            ),
        ]

    class JSONAPIMeta:
        resource_name = "my-models"
```
M2M Relationships (MUST use through models)
```python
class Resource(RowLevelSecurityProtectedModel):
    tags = models.ManyToManyField(
        ResourceTag,
        through="ResourceTagMapping",  # REQUIRED for RLS
    )


class ResourceTagMapping(RowLevelSecurityProtectedModel):
    # Through model MUST have tenant_id for RLS
    resource = models.ForeignKey(Resource, on_delete=models.CASCADE)
    tag = models.ForeignKey(ResourceTag, on_delete=models.CASCADE)

    class Meta:
        constraints = [
            RowLevelSecurityConstraint(
                field="tenant_id",
                name="rls_on_%(class)s",
                statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
            ),
        ]
```
Async Task Response Pattern (202 Accepted)
For long-running operations, return 202 Accepted with a reference to the Celery task, as in the sketch below.
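A minimal sketch of the pattern; BaseRLSViewSet and request.tenant_id come from this document, while the import paths, task name perform_scan_task, and response shape are illustrative assumptions:

```python
from rest_framework import status
from rest_framework.response import Response

from api.base_views import BaseRLSViewSet   # assumed import path
from api.tasks import perform_scan_task     # hypothetical task name


class ScanViewSet(BaseRLSViewSet):
    def create(self, request, *args, **kwargs):
        serializer = self.get_serializer(data=request.data)
        serializer.is_valid(raise_exception=True)
        scan = serializer.save()

        # Kick off the long-running work asynchronously instead of blocking.
        task = perform_scan_task.delay(
            tenant_id=str(request.tenant_id), scan_id=str(scan.id)
        )

        # 202 Accepted: the body points the client at the task to poll.
        return Response({"task_id": task.id}, status=status.HTTP_202_ACCEPTED)
```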
Adding a new provider: add it to the ProviderChoices enum and create a validate_<provider>_uid() static method (see the sketch below).
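A hedged sketch of that pattern; the enum value, field definition, and UID regex are illustrative only, not the actual Prowler validation rules:

```python
import re

from django.core.exceptions import ValidationError
from django.db import models

from api.rls import RowLevelSecurityProtectedModel


class Provider(RowLevelSecurityProtectedModel):
    class ProviderChoices(models.TextChoices):
        AWS = "aws", "AWS"
        EXAMPLE = "example", "Example Cloud"  # new provider added to the enum

    provider = models.CharField(
        max_length=32, choices=ProviderChoices.choices, default=ProviderChoices.AWS
    )

    @staticmethod
    def validate_example_uid(value: str) -> None:
        # One validate_<provider>_uid() static method per provider type.
        if not re.fullmatch(r"[a-z0-9-]{4,64}", value):
            raise ValidationError("Invalid Example Cloud provider UID.")
```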
RBAC Permissions
| Permission | Controls |
|------------|----------|
| MANAGE_USERS | User CRUD, role assignments |
| MANAGE_ACCOUNT | Tenant settings |
| MANAGE_BILLING | Billing/subscription |
| MANAGE_PROVIDERS | Provider CRUD |
| MANAGE_INTEGRATIONS | Integration config |
| MANAGE_SCANS | Scan execution |
| UNLIMITED_VISIBILITY | See all providers (bypasses provider_groups) |
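A hedged sketch of gating an action on one of these permissions; get_role() comes from the rules above (it returns only the user's first role), while the import path and the manage_providers boolean field are assumptions about the Role model:

```python
from rest_framework.exceptions import PermissionDenied

from api.rbac.permissions import get_role  # assumed import path


def ensure_can_manage_providers(request) -> None:
    role = get_role(request.user)
    # get_role() returns only the FIRST role, so multi-role users need care.
    if role is None or not role.manage_providers:
        raise PermissionDenied("MANAGE_PROVIDERS permission is required.")
```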
RBAC Visibility Pattern
```python
def get_queryset(self):
    user_role = get_role(self.request.user)
    if user_role.unlimited_visibility:
        return Model.objects.filter(tenant_id=self.request.tenant_id)
    else:
        # Filter by provider_groups assigned to the role
        return Model.objects.filter(provider__in=get_providers(user_role))
```
Celery Queues
| Queue | Purpose |
|-------|---------|
| scans | Prowler scan execution |
| overview | Dashboard aggregations (severity, attack surface) |
| compliance | Compliance report generation |
| integrations | External integrations (Jira, S3, Security Hub) |
| deletion | Provider/tenant deletion (async) |
| backfill | Historical data backfill operations |
| scan-reports | Output generation (CSV, JSON, HTML, PDF) |
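A hedged sketch of routing work onto one of these queues at dispatch time; the task name and arguments are illustrative, only the queue name comes from the table above:

```python
from celery import shared_task


@shared_task
def generate_compliance_report(tenant_id: str, scan_id: str) -> None:
    ...


# Route the call to the compliance queue explicitly:
generate_compliance_report.apply_async(
    kwargs={"tenant_id": "...", "scan_id": "..."},
    queue="compliance",
)
```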
Task Composition (Canvas)
Use Celery's Canvas primitives for complex workflows:
| Primitive | Use For |
|-----------|---------|
| chain() | Sequential execution: A → B → C |
| group() | Parallel execution: A, B, C simultaneously |
| Combined | Chain with nested groups for complex workflows |
Note: Use .si() (immutable signature) to prevent result passing; use .s() if you need to pass results.
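A minimal sketch of a chained workflow with a nested group; the task names are illustrative, and .si() keeps each step from receiving the previous step's result:

```python
from celery import chain, group, shared_task


@shared_task
def perform_scan(tenant_id: str, scan_id: str) -> None: ...


@shared_task
def aggregate_overview(tenant_id: str, scan_id: str) -> None: ...


@shared_task
def generate_scan_report(tenant_id: str, scan_id: str) -> None: ...


# Scan first, then fan out the aggregations in parallel.
workflow = chain(
    perform_scan.si(tenant_id="...", scan_id="..."),
    group(
        aggregate_overview.si(tenant_id="...", scan_id="..."),
        generate_scan_report.si(tenant_id="...", scan_id="..."),
    ),
)
workflow.apply_async()
```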
Run Django's deployment checklist to verify the production configuration:
```bash
cd api && poetry run python src/backend/manage.py check --deploy
```
Critical Settings
| Setting | Production Value | Risk if Wrong |
|---------|------------------|---------------|
| DEBUG | False | Exposes stack traces, settings, SQL queries |
| SECRET_KEY | Env var, rotated | Session hijacking, CSRF bypass |
| ALLOWED_HOSTS | Explicit list | Host header attacks |
| SECURE_SSL_REDIRECT | True | Credentials sent over HTTP |
| SESSION_COOKIE_SECURE | True | Session cookies over HTTP |
| CSRF_COOKIE_SECURE | True | CSRF tokens over HTTP |
| SECURE_HSTS_SECONDS | 31536000 (1 year) | Downgrade attacks |
| CONN_MAX_AGE | 60 or higher | Connection pool exhaustion |
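A minimal settings.py sketch with those production values, assuming they are read from the environment; the environment variable names are illustrative, not the project's actual configuration:

```python
import os

DEBUG = False
SECRET_KEY = os.environ["DJANGO_SECRET_KEY"]  # illustrative env var name; rotate it
ALLOWED_HOSTS = os.environ["DJANGO_ALLOWED_HOSTS"].split(",")  # explicit host list

SECURE_SSL_REDIRECT = True
SESSION_COOKIE_SECURE = True
CSRF_COOKIE_SECURE = True
SECURE_HSTS_SECONDS = 31536000  # 1 year

# CONN_MAX_AGE is set per database alias inside DATABASES, e.g.:
# DATABASES["default"]["CONN_MAX_AGE"] = 60
```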
Commands
```bash
# Development
cd api && poetry run python src/backend/manage.py runserver
cd api && poetry run python src/backend/manage.py shell

# Celery
cd api && poetry run celery -A config.celery worker -l info -Q scans,overview
cd api && poetry run celery -A config.celery beat -l info

# Testing
cd api && poetry run pytest -x --tb=short

# Production checks
cd api && poetry run python src/backend/manage.py check --deploy
```