Some amazing stats in here (free to download; just register your email). I literally want to copy and paste half the report here, but that's probably bad manners toward the authors.
Gartner predicted 15% would use SOAR by 2020, but the Ponemon survey finds that 46% “expect to use it in the next six to 12 months” (I accept that not all automation is SOAR, but this is a security conversation, so maybe it should be).
“[…] Unfortunately improvements in staffing are not happening.”
I’ve written before that SOAR won’t necessarily replace half your team (though it can reduce workload), and that’s mirrored by the audience, though I didn’t expect respondents to anticipate an increase:
23% say “Automation will reduce the headcount of our IT security function”
Whilst 44% say “Automation will increase the need to hire people with more advanced technical skills”
Two of the main reasons AI (which, as we all know, is really ML) is needed are to reduce human error and to improve 24/7 monitoring and response.
7 great points that come up often. A couple of these I’ve already touched on in this blog (e.g. automation doesn’t replace headcount), but the above link does a much better job of exploring them in detail than I can.
Here is another interesting chat, from RSA Conference this year. A gentleman approached me asking if we could help with his organisation’s problem of moving data and insider threat.
His organisation’s policy makers were happy to use cloud for standard business services, but not for storing their sensitive data (he wouldn’t tell me the specifics). Any time they wanted to move data from that ‘area’ of the network to the cloud, they were refused by policy in case there was data leakage… you know… just to be safe.
His first problem I couldn’t help with; apparently an encrypted VPN isn’t safe enough for transmission. Maybe they will end up with sneakernet and a suitcase + handcuffs.
The second problem, though, was a great use case for SOAR, and not one I’ve come across before. The data source and data destination were from different vendors, with no existing integration between them. This means the process is very manual and potentially exposes sensitive data to the insider threat / operators.
So I demonstrated our playbook execution and how we communicate with end users. The final pseudo-design we agreed on was (a rough code sketch follows the list):
A playbook that can be initiated by a schedule or by an inbound request
The playbook automatically restricts permissions of the ticket. Access is only granted with 2 pairs of eyes.
The playbook fetches the data from vendorA
The playbook then runs some basic checks against the data: pattern matching, file-type checking, maybe pushing it through a DLP, and so on
If the data is sensitive, we can stop the process, flag the ticket, etc.
If the data is good, we push it to the remote system and close the ticket
However, if the data is neither definitely good nor definitely bad, we can use CommunicationTasks to email a manager and the original ticket requester asking what to do: proceed or stop?
Using our ComTask we can interactively engage the end user without exposing the data in question (see above)
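To make that flow concrete, here is a minimal Python sketch of the pseudo-design. Everything in it is illustrative: the vendor functions, the permission lock, and the ComTask call are stubs standing in for the real integrations and playbook tasks, and the sensitive-data pattern is just an example.

```python
import re

SENSITIVE = [r"\b\d{3}-\d{2}-\d{4}\b"]  # illustrative, e.g. SSN-like strings

# Stubs standing in for the vendor integrations and ticket actions;
# in the real playbook these are integration commands and tasks.
def fetch_from_vendor_a(ticket):
    return b"example payload"

def push_to_vendor_b(blob):
    print(f"pushed {len(blob)} bytes to vendorB")

def restrict_permissions(ticket):
    print("ticket locked: access requires 2 pairs of eyes")

def flag(ticket, reason):
    print(f"ticket flagged: {reason}")

def close(ticket):
    print("ticket closed")

def ask_via_comtask(ticket, question):
    return "proceed"  # manager + original requester reply by email

def run_transfer(ticket):
    restrict_permissions(ticket)
    blob = fetch_from_vendor_a(ticket)
    text = blob.decode("utf-8", errors="ignore")
    if any(re.search(p, text) for p in SENSITIVE):
        flag(ticket, "sensitive data detected")   # definitely bad: stop
    elif text.isprintable():
        push_to_vendor_b(blob)                    # definitely good: ship it
        close(ticket)
    else:
        # Neither clearly good nor bad: ask, without exposing the data
        if ask_via_comtask(ticket, "Proceed or stop?") == "proceed":
            push_to_vendor_b(blob)
            close(ticket)
        else:
            flag(ticket, "transfer stopped by reviewer")

run_transfer({"id": 1})
```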
To summarise: they can still do the process (quicker than before and with fewer mistakes), they’ve removed operator visibility of the data, and their workers still retain the ability to initiate and steer the workflow. Pretty cool.
Though not predominantly a SOC incident type, it shows that automation is automation; be as creative as you like.
An interactive web frontend using an API to serve all the information guests need
This will require each person to use a unique code/password
Though creating online accounts is too complicated for Great Aunt Betty…
…so use a minimal URL including a unique code (see the quick sketch after this list)
And for mobile device convenience, a QR Image with the code baked in
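For illustration, here is roughly what the unique-code and minimal-URL idea boils down to. The domain is a placeholder, not the real wedding site:

```python
import secrets

BASE_URL = "https://example-wedding.site"  # placeholder domain

def guest_link():
    """One short, unguessable code per guest, and the minimal URL it lives at."""
    code = secrets.token_urlsafe(6)  # ~8 URL-safe characters, typeable by Betty
    return code, f"{BASE_URL}/{code}"

code, url = guest_link()
print(code, url)  # e.g. kX3n9QZw https://example-wedding.site/kX3n9QZw
```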
With the needs identified, I mocked up this workflow.
So here, in part 1 of this 3-part series, are all the steps along with images!
Create a Demisto instance specifically for my Wedding
Design the schema for “guest” and map the fields to a “new ticket” form
Time to add all the guests (originally I started typing these in manually, but I realised I’m lazy, so I created a CSV and then wrote a playbook to import and process each row)
The above playbook executes once, creating dozens of tickets, one for each invitee. Each ticket then runs a playbook to prepare and process itself.
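For a flavour of what the import step can look like, here is a hedged sketch of a Demisto automation script. It assumes Demisto’s scripting runtime (where the demisto object is injected) and the built-in createNewIncident command; the custom field names (guestname, address, uniquecode) are illustrative stand-ins for whatever the “guest” schema defines.

```python
# Runs inside Demisto's automation runtime; `demisto` is injected there.
import csv
import io
import secrets

csv_text = demisto.args().get("csv")  # the CSV content passed to the script
for row in csv.DictReader(io.StringIO(csv_text)):
    demisto.executeCommand("createNewIncident", {
        "name": "Invite: " + row["name"],
        "guestname": row["name"],                # illustrative custom fields
        "address": row["address"],
        "uniquecode": secrets.token_urlsafe(6),  # one unique code per invitee
    })
demisto.results("Created one ticket per invitee")
```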
The QR Code task calls the QRCodeMonkey integration. We give it the string “https://__url__/__uniquecode__” and it returns an image.
Here is a QR Code generated by the automation (dummy data)
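The QRCodeMonkey integration itself isn’t shown here, so as a stand-in, here is the same idea with the open-source qrcode Python library (pip install "qrcode[pil]"), purely for illustration:

```python
import qrcode

url = "https://__url__/__uniquecode__"  # same placeholder string as the task
img = qrcode.make(url)                  # returns a PIL image
img.save("invite_qr.png")               # this file gets attached to the ticket
```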
When I was happy the data looked right, I wrote a playbook to loop through every invitee and send an email to the printing company containing the invitee’s name, address, unique URL, and QR code image as an attachment.
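As a rough sketch of that send loop, again assuming Demisto’s runtime and a configured mail-sender integration exposing the standard send-mail command; the printer address, the get_invitee_tickets() helper, and the field names are all hypothetical placeholders:

```python
PRINTER_EMAIL = "orders@example-printer.com"  # hypothetical address

for guest in get_invitee_tickets():  # hypothetical: one dict per invitee ticket
    demisto.executeCommand("send-mail", {
        "to": PRINTER_EMAIL,
        "subject": "Invite print job: " + guest["guestname"],
        "body": "Name: {guestname}\nAddress: {address}\nURL: {url}".format(**guest),
        "attachIDs": guest["qr_file_id"],  # the generated QR image
    })
```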
And here is the output: lots of emails, each tailored to an individual invitee.
Here is the real benefit: should we make any changes to the URL, email, QR code, or an individual invitee, we can make one simple change and then execute the playbooks again. All data is regenerated, reprocessed, and automatically emailed out… all with one click!