The Beehive, City Place, Gatwick, RH6 0PA, United Kingdom
+44 (0)20 801 74646

Venari

Jumpstart DevSecOps with the next generation of AppSec testing

Venari is a highly scalable, next-generation site discovery and vulnerability analyser that works with modern web frameworks. The Intelligent Crawl Engine (ICE) understands single-page apps and tracks the changing DOM. The exploiter module uses adaptive probing to intelligently trace entry and exit points and map the entire attack surface. Scans can be run with automatic login for simple point-and-shoot analysis.

Large-scale application scanning requires container and cloud integration, API front ends and easy onboarding of new applications. Assert Security’s cross-platform Venari DevOps Edition is built from the ground up to be externally driven via APIs. Every micro-service is documented with the OpenAPI Specification (Swagger), and scan nodes run equally well in Docker containers, VMs or on bare-metal hardware. Our simple licensing model lets you add capacity without complicated redeployments or locking keys to specific servers.

A shared core analysis engine powers all of Assert’s products, from the free desktop edition to the worker nodes in our automation cluster. The scanning technology has been re-imagined to deeply inspect modern web frameworks and design patterns. A fully headless, parallel browser engine interacts with the application in a logged-in state and maintains that login as it discovers the full depth and breadth of the DOM. These DOM snapshots and interactions feed the intelligent probing, fingerprinting, crawl and exploit phases. Auto-Login works in the majority of applications, so the browser driver can re-acquire the login state as needed. The combination of methodically walking the application UI states with adaptive probing and fuzzing leads to unmatched coverage and accuracy. For example, the exploiter can verify that injected script actually executed.

The scan engine fluidly handles single-page applications (SPAs) and sites that make heavy use of XHRs, WebSockets and modern JavaScript frameworks. This browser-first architecture is a quantum leap beyond more traditional AppSec tools that rely on spidering to reveal application content.

The automation platform is a collection of worker nodes that can run independently or collaboratively when needed. Each analysis node is a self-contained scan engine that surfaces all of its capabilities via REST APIs. These APIs are organised into small micro-services and are completely documented using the OpenAPI Specification (Swagger). Scan nodes can be spun up and down on demand. A master node persists the data exported by individual scan jobs and maintains it in application-specific workspaces. Scan templates, auto-generated workflows and findings are all stored in a NoSQL database, which has a REST API for easy harvesting of result data.
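
As a rough illustration of harvesting result data over REST, the sketch below pulls the findings for one workspace from a master node. The host, routes, field names and token shown here are illustrative assumptions only; the authoritative contracts are published in each node’s Swagger document.

    import requests

    MASTER = "https://master.example.internal:9000"   # hypothetical master node URL
    TOKEN = "REPLACE_WITH_BEARER_TOKEN"               # obtained from the identity service

    def list_findings(workspace: str) -> list:
        # Hypothetical endpoint; consult the node's Swagger document for the real route.
        resp = requests.get(
            f"{MASTER}/api/workspaces/{workspace}/findings",
            headers={"Authorization": f"Bearer {TOKEN}"},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()

    for finding in list_findings("payments-portal"):
        print(finding.get("severity"), finding.get("name"), finding.get("url"))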

The overarching design goal is to provide components that allow businesses to fully orchestrate their DevSecOps processes in an end-to-end automation loop.

A fully functional client UI is freely available on GitHub. The Assert scan node architecture was built from the ground up to be headless and automation-friendly. The micro-service suite makes it easy to integrate with your existing build and test infrastructure.

Venari was built from the ground up to empower organisations to fully shift left. Every desktop tool, automation module and extension runs on Windows, Linux or Mac. Early design choices like .NET Core, Electron, NodeJS, Angular, VS Code integration and Docker enable us to put quality tools in the hands of developers, QA pros and AppSec specialists without restricting the OS.

Scan nodes run independently once the master controller delivers a job.  When the job is complete they idle and wait for the next job.  This independence means that you can run as many scans in parallel as you have job nodes.

Some scans can be expected to take a long time when the application or the rule set is large. In these cases, a scan job can be configured as ‘shared’, with a maximum number of nodes allowed to collaborate in parallel. The maximum lets you impose a limit and avoid overloading the application under test. Any scan nodes that are sitting idle are enlisted to help speed up the job. This is effective because the browser analysis that fully exercises each navigable resource runs a pool of headless browser engines on each node. Browser engine analysis is often the bottleneck in a large scan, and this elastic collaboration spreads the compute and IO across multiple nodes.
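
To make the idea concrete, a shared job might be described with a small settings payload along the following lines. The field names and endpoint are hypothetical placeholders; the actual job schema is defined in the master node’s Swagger document.

    import requests

    MASTER = "https://master.example.internal:9000"   # hypothetical master node URL

    # Hypothetical job definition: a shared scan capped at four collaborating nodes.
    job = {
        "application": "payments-portal",
        "template": "full-audit",
        "shared": True,
        "maxNodes": 4,    # upper bound to avoid overloading the application under test
    }

    resp = requests.post(f"{MASTER}/api/jobs", json=job, timeout=30)
    resp.raise_for_status()
    print("queued job", resp.json().get("jobId"))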

REST APIs fully expose the capabilities of each node in the cluster. The micro-services below are drawn from a growing set of automation hooks that enable integration with other AppSec tooling (a usage sketch follows the list):

  • Job Controller: Executes and monitors a single job on a specific node
  • Auth Controller: OpenID Connect-based identity server
  • DB Controller: Used internally and available for integration
  • Browser Controller: Access browser’s crawl trees, executed actions, state details and generated workflows
  • Script Controller: Control the pool of script execution modules
  • Traffic Controller: Query HTTP traffic details from the scan
  • Findings Controller: Query findings and details
  • Fingerprint Controller: Probe, reflection and other fingerprint detail access
  • Rule Controller: Manage extensible YAML rules available to the scan jobs
  • Workflow Controller: Workflow and dataset access
  • Log Controller: Log import/export and queries
  • Resource Controller: Manage and access file artifacts for import and export to integrate with third party tool data
  • Workspace Controller: Manage workspaces and associated data like findings
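
As a rough sketch of how these controllers compose, the snippet below obtains a token from the OpenID Connect identity server and then polls the Job Controller for a job’s status. The base URLs, client credentials, routes and response fields are assumptions for illustration; the authoritative contracts live in each controller’s Swagger document.

    import time
    import requests

    NODE = "https://scan-node-1.example.internal:9001"   # hypothetical scan node URL
    AUTH = "https://master.example.internal:9000"        # hypothetical identity server URL

    # Client-credentials token request against the OpenID Connect identity server.
    # The client id/secret and token route are placeholders for illustration.
    token = requests.post(
        f"{AUTH}/connect/token",
        data={
            "grant_type": "client_credentials",
            "client_id": "ci-pipeline",
            "client_secret": "REPLACE_ME",
        },
        timeout=30,
    ).json()["access_token"]

    headers = {"Authorization": f"Bearer {token}"}

    # Poll the Job Controller (hypothetical route) until the job finishes.
    job_id = "2f6c1c2e"                                  # placeholder job identifier
    while True:
        status = requests.get(f"{NODE}/api/jobs/{job_id}", headers=headers, timeout=30).json()
        if status.get("state") in ("Completed", "Failed"):
            break
        time.sleep(10)

    print("job finished with state:", status.get("state"))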

Practitioners in all AppSec roles can access the scan cluster via the JobViewer UI, available on GitHub. Access is restricted by the roles and permissions of the specific user. A second user interface is available as a free extension for the popular, cross-platform VS Code editor. The Assert extension allows simple onboarding of new applications and can import findings from a master workspace into a local workspace, enabling audit of scan results on the desktop.

Auto-Login can be tested, and custom YAML rules can be created, edited and tested in the Rule IDE. Rule customisation allows organisations to add tests specific to an application or to their compliance testing policy. All of Venari’s rules are freely available to the community and are extensible and testable in VS Code.

Applications are onboarded in the VS Code extension, where the user provides credentials for auto-login and tests that it works. The test is as simple as watching the browser controller drive headless Chrome to achieve the login from a starting point of just the credentials. If Auto-Login cannot compute a successful login path and click stream, the user can record the path. The user can also specify advanced configuration and/or use prerecorded workflows or ones auto-generated by a test scan. Once the desired configuration exists to enable a good scan, the data is uploaded to the master control node so that it can be used by scheduled, triggered or manually initiated scans in the node cluster.

The VS Code extension has a powerful single-step remediation feature that uses scan metadata about how the vulnerability was found, along with all of the information needed to reproduce it locally. There is no need to repeat a multi-hour scan to verify a fix or identify a false positive. The extension replays the steps needed to isolate the browser interactions – including login – to re-fuzz the vulnerable area of the app and check the result. The HTTP and browser data from the original scan can be imported into the extension, where the URLs and endpoint information are translated into a locally executable sandbox.

Assert’s automation nodes can run in Docker containers, virtual machines or on bare-metal hardware. The Docker use case is specifically aimed at short-lived instances that spin up to execute a scan, export their data to the master node and disappear. A generic worker node does not require installation of license keys or any other software (such as a commercial database). The nodes are lightweight and meant to be disposable. The master control node is a little different and may warrant either a long-lived container or a dedicated piece of hardware, but that is not a requirement. The decision comes down to how and where persistent data needs to be exported and stored, and this will vary. If an organisation is dedicated to zero physical infrastructure, the master node can also run in a container. Adding more capacity is a simple matter of configuring max concurrency on the master control node.
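
As a rough sketch of the ephemeral worker pattern, the snippet below uses the Docker SDK for Python to start a throwaway scan node that the daemon removes when it exits. The image name and environment variable are placeholders, not the actual published image or configuration.

    import docker

    client = docker.from_env()

    # Start a disposable worker node (image name and env vars are hypothetical).
    container = client.containers.run(
        "assertsecurity/venari-scan-node:latest",    # placeholder image tag
        detach=True,
        auto_remove=True,                            # daemon removes the container when it exits
        environment={
            "MASTER_NODE_URL": "https://master.example.internal:9000",   # placeholder
        },
    )

    print("worker started:", container.short_id)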

The licensing model is deliberately simple and intended to remove friction from deployment and to resolve the mystery of ‘how many licenses do I need?’ The master controller is the sole point of enforcement for how much scanning can run in parallel. If you find that you need more capacity for your test load, you can purchase additional node counts; we will provide the data needed to make a simple settings change on the master node. There is nothing to do on the worker nodes: they run when the master controller tells them to, so license configuration on those nodes does not exist.

Here is a list of things that make life simple:

  • No need to run setup.exe and configure a special machine to meet all the prerequisites
  • No need to download and apply license key information to the scan box
  • Easy automation of license deactivation and reactivation on another image
Free Trial

Fill in the form below to get an evaluation of Venari

