Get the Inside Scoop on Platform Management Assessments
We recently grabbed some time with one of the Senior Consultants here at RiverSafe to get a more detailed view of what actually goes on when RiverSafe conducts a Platform Management Assessment for its customers.
Here are the key takeaways and insights from our conversation:
Why do RiverSafe provide this assessment and why do organisations need it?
“We’ve worked in so many SIEM, SOAR and UEBA environments over the years that we’ve gained a really broad perspective on the best way to manage them. So many of the technical issues we see are rooted in the management practices surrounding the platforms. Our Assessment gives organisations an opportunity to benefit from that experience.”
What are the main reasons organisations approach you for an Assessment?
“There are a variety of reasons, to be honest. For some organisations it’s been incident-driven; something’s gone wrong which has had a business impact and, as a result, they want a second opinion on how they’re doing things.
For other organisations, it’s been triggered as part of a broader organisational review of activity and they are evaluating how they’re doing things more generally.
The common thread across all the organisations I’ve done this for is a desire to improve the management processes around their security platforms, and a belief that some input from an organisation that has experience in this area would be very useful.”
How does the process normally kick off?
“The assessment conversation normally takes an hour or two. We follow a standard list of questions to drive it, which ensures we take a consistent approach. We normally do this over a call with someone in the target organisation who has some insight into both the technical and operational aspects of the SIEM, SOAR or UEBA platform. In some cases that’s a few people rather than one, but that works fine.”
What happens next?
“We apply a gap analysis against a best practice template. Throughout this process we are thinking about completeness (are all of the processes we’d expect to see in place?), frequency (for example, are health checks performed often enough given the rate of change in the environment?), effectiveness (does the process achieve its objectives?) and, finally, the time and resource cost involved.
Then we validate all of that against current outcomes. For example, if it looks like the organisation has a process for software updates that seems to be implemented at a good frequency, but in practice they are a few versions behind, that tells us something useful.
We turn that analysis into a set of recommendations and try to make sure those recommendations are easy to turn into actions. We don’t want the assessment to sit on a shelf; it’s intended to drive improvement.”
What are some examples of the kinds of changes customers have made as a result of the Assessment?
“Again, this varies. In some cases, it can be the addition of new processes, whereas in other cases it’s scaling back in some areas and upping the effort in others. Most often, it’s tweaks to how the current processes operate, or highlighting features in the platform that aren’t being utilised but could make life easier for them.
We always finish up with a comparison against our own Managed Service, which provides a great frame of reference for best practice and helps the target organisation understand what an industry best practice service would look like as an alternative.”
See our detailed Platform Management Assessment brochure here