Title: Best Practices and Tool Recommendations for DAST in CI/CD Pipelines
With the adoption of agile development and DevOps practices, there is a clear push toward automating security testing throughout the software delivery lifecycle. I am looking for a focused discussion of Dynamic Application Security Testing (DAST) approaches tailored to integration within CI/CD pipelines.
Tooling: What DAST solutions (commercial or open-source) have members found most reliable and accurate in modern pipeline environments? For those using tools like OWASP ZAP, Burp Suite Enterprise, or AppScan, what integration challenges have you encountered?
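For context, here is roughly how I invoke ZAP from a pipeline step today. It is a minimal sketch: the image tag, target URL, mount path, and report file names are placeholders rather than recommendations.

```python
import os
import subprocess
import sys

# Minimal sketch: run a ZAP baseline scan against a staging deployment from a CI step.
TARGET = "https://staging.example.com"          # hypothetical staging deployment
ZAP_IMAGE = "ghcr.io/zaproxy/zaproxy:stable"

result = subprocess.run(
    [
        "docker", "run", "--rm",
        "-v", f"{os.getcwd()}:/zap/wrk:rw",     # mount the workspace so reports survive the container
        ZAP_IMAGE, "zap-baseline.py",
        "-t", TARGET,                           # target to spider and passively scan
        "-J", "zap-report.json",                # machine-readable report for post-processing
        "-r", "zap-report.html",                # human-readable report kept as a build artifact
    ],
    check=False,
)

# zap-baseline.py exits non-zero when alerts reach WARN/FAIL level; whether that
# should break the build is exactly the kind of policy question I am asking about.
sys.exit(result.returncode)
```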
False Positives/Noise Reduction: How do you manage the high rate of false positives that DAST tools can produce, especially in rapid-release scenarios? Are there proven techniques for tuning scans or post-processing results to avoid alert fatigue?
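To illustrate the kind of post-processing I mean: a small filter that drops alerts already triaged as false positives before deciding pass/fail. The report layout assumed here is what `zap-baseline.py -J` emits (a "site" list, each entry carrying an "alerts" list); other tools will differ, and the suppressed entry is just an example.

```python
import json

# Suppress alerts already triaged as false positives, keyed by ZAP plugin ID and URL.
SUPPRESSED = {
    ("10021", "https://staging.example.com/healthz"),  # example triaged entry
}

with open("zap-report.json") as fh:
    report = json.load(fh)

remaining = []
for site in report.get("site", []):
    for alert in site.get("alerts", []):
        for instance in alert.get("instances", []):
            key = (alert.get("pluginid"), instance.get("uri"))
            if key not in SUPPRESSED:
                remaining.append((alert.get("alert"), instance.get("uri")))

print(f"{len(remaining)} findings left after suppression")
for name, uri in remaining:
    print(f"- {name}: {uri}")
```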
Scoping and Authentication: What strategies are effective for dynamically scoping scans to newly introduced endpoints or microservices? How are session management and authenticated scan scenarios handled in ephemeral test environments?
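On scoping, one approach I have been experimenting with is diffing the OpenAPI spec between the last scanned build and the current one, then seeding the scanner only with newly added paths; the authenticated-scan half of the question is the part I have no good answer for. A rough sketch, with illustrative file names and base URL:

```python
import json

# Diff the OpenAPI spec of the current build against the previously scanned one
# and emit only the added paths as scan seeds / include-in-scope rules.
with open("openapi-previous.json") as fh:
    previous_paths = set(json.load(fh).get("paths", {}))
with open("openapi-current.json") as fh:
    current_paths = set(json.load(fh).get("paths", {}))

BASE_URL = "https://staging.example.com"        # hypothetical ephemeral environment
for path in sorted(current_paths - previous_paths):
    print(f"{BASE_URL}{path}")                  # feed these into the scanner's scope
```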
Balancing Depth and Pipeline Speed: Given the time constraints of CI/CD, what approaches work for balancing thorough security coverage against pipeline latency? Are there incremental or risk-based scanning methodologies that have proven effective in practice?
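The kind of risk-based split I have in mind looks roughly like the sketch below: a fast passive scan on every merge request, a full active scan only when the change set touches security-sensitive areas (or on a nightly schedule). The path prefixes and the scan scripts mentioned in the comments are placeholders.

```python
import subprocess

# Risk-based split: passive baseline scan on every merge request, full active
# scan only when the change set touches security-sensitive areas.
SENSITIVE_PREFIXES = ("src/auth/", "src/payments/", "deploy/ingress/")

changed_files = subprocess.run(
    ["git", "diff", "--name-only", "origin/main...HEAD"],
    capture_output=True, text=True, check=True,
).stdout.splitlines()

if any(f.startswith(SENSITIVE_PREFIXES) for f in changed_files):
    print("risk-triggered: run the full active scan (e.g. zap-full-scan.py)")
else:
    print("low risk: run the passive baseline scan only, keeping pipeline latency low")
```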
Metrics and Continuous Improvement: What metrics are most effective for tracking the impact of DAST in the pipeline, and how do teams iterate on scan configurations over time?
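For what it is worth, the only metrics I track today come from a simple findings log the pipeline appends to (the format is my own, not a tool export), roughly as sketched below; I suspect this is far from best practice.

```python
import json
from datetime import date

# Compute basic metrics from a homegrown findings log: each entry has
# "severity", "first_seen", and "resolved" (null while still open).
with open("dast-findings-log.json") as fh:
    findings = json.load(fh)

open_by_severity: dict[str, int] = {}
remediation_days = []
for f in findings:
    if f["resolved"] is None:
        open_by_severity[f["severity"]] = open_by_severity.get(f["severity"], 0) + 1
    else:
        days = (date.fromisoformat(f["resolved"]) - date.fromisoformat(f["first_seen"])).days
        remediation_days.append(days)

print("Open findings by severity:", open_by_severity)
if remediation_days:
    print("Mean time to remediate (days):", sum(remediation_days) / len(remediation_days))
```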
Looking for insights based on practical experience rather than vendor materials. Detailed workflows, common pitfalls, and open questions for future improvement are especially welcome.