
Collect and Analyze Relevant Data Points to Assess Risk


As the data managed by companies continues to grow in both volume and importance, so does the importance of ensuring that access to it is properly controlled. In part 1 of this 5-part ‘Checkbox Compliance to True Data Security’ blog series, we took you through the Discovery process. Now that you know your organization’s data footprint, the next step to true data security is the Collect and Analyze phase.

The goal of the Collect and Analyze phase is to assess relevant data points to answer critical questions: What is the sensitivity of the data? Who has access to it? Who owns it? How old is it? Once you understand the answers to these questions, you can begin prioritizing the resources at greatest risk and limiting access to them as you work toward a Least Privilege Access model.

The Principle of Least Privilege is THE Goal

The Principle of Least Privilege is the idea that access to data and resources should be provisioned with the bare minimum permissions necessary to perform a job function. Think about it this way: you wouldn’t want to give a marketing manager access to the salaries of everyone in the company, right? Most organizations would want to limit that particular access to only HR representatives and the CEO/CFO, for example. Yet most administrators cannot easily answer the basic question of what an existing user currently has access to, let alone whether that user needs that access in the first place.
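As an illustration of the idea (the role definitions, share names, and permission strings below are hypothetical, not from any particular product), a least-privilege review boils down to comparing what a user has actually been granted against what their role requires:

```python
# Hypothetical sketch of a least-privilege check.
# Role-to-permission mappings are illustrative assumptions.
ROLE_REQUIRED = {
    "marketing_manager": {"marketing_share:read", "marketing_share:write"},
    "hr_representative": {"hr_share:read", "hr_share:write", "salaries:read"},
}

def excess_permissions(role: str, granted: set) -> set:
    """Return permissions granted beyond what the role's job function requires."""
    return granted - ROLE_REQUIRED.get(role, set())

# A marketing manager who somehow picked up read access to salary data:
granted = {"marketing_share:read", "marketing_share:write", "salaries:read"}
print(sorted(excess_permissions("marketing_manager", granted)))  # → ['salaries:read']
```

Anything the check returns is a candidate for revocation; an empty result means the grants already match the role’s needs.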

The risk of ignoring the access issue is more critical today due to the increased scrutiny of auditors and compliance legislation. The EU’s General Data Protection Regulation (GDPR), for example, is probably top of mind for your security team. To comply with GDPR, you must know for certain where an EU citizen’s personal data is stored and who has access to it.

There’s also the ever-present risk of insider threats, which has made the current lack of effective controls very apparent. To build a proper access control system, you need to collect and analyze relevant data points to figure out which data is considered sensitive. To help you think that through, here are a few questions to consider:

  • Is your organization in a regulated industry (e.g. healthcare, finance, construction, oil)?
  • Do you collect personally identifiable information (PII) like social security numbers, date of birth, or driver’s license numbers?
  • Do you know if your organization has company-specific proprietary information?
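The PII question in particular lends itself to automated discovery. Here is a minimal sketch of pattern-based scanning for the formats mentioned above; the regexes are naive assumptions for illustration, whereas a real classification engine validates matches, weighs surrounding context, and covers hundreds of formats:

```python
import re

# Naive patterns for two common US PII formats (illustrative only).
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),          # e.g. 123-45-6789
    "date_of_birth": re.compile(r"\b\d{2}/\d{2}/\d{4}\b"),  # e.g. 04/12/1987
}

def find_pii(text: str) -> dict:
    """Return a mapping of pattern name -> matches found in the text."""
    hits = {}
    for name, pattern in PII_PATTERNS.items():
        matches = pattern.findall(text)
        if matches:
            hits[name] = matches
    return hits

sample = "Employee SSN: 123-45-6789, DOB: 04/12/1987"
print(find_pii(sample))  # → {'ssn': ['123-45-6789'], 'date_of_birth': ['04/12/1987']}
```

Running a scan like this across file shares is one concrete way to answer “do we collect PII?” with evidence rather than a guess.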

Yes, Your Organization Has Sensitive Data

If you’ve answered yes to any or all of the above, then you have data that is considered sensitive. It’s also important to consider where you’re looking for this data. In the Discovery phase, we discussed structured and unstructured data, and how important it is to look for sensitive data across both kinds of repositories. Unstructured data is especially challenging when you consider that sensitive data could be hiding in emails, spreadsheets, images, and hundreds of other file formats.

Another important category of information to collect is file metadata. Along with file authors and owners, last modified and last accessed dates, and file names and types, you should also be collecting and analyzing existing classifications and tags. All of this information helps expedite the process of figuring out who owns the data (as will activity data collected during the “Monitor” phase of the process), how old it is, and what can or should eventually be done with it.
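Most of that metadata is available directly from the filesystem. A hedged sketch of collecting it with Python’s standard library (owner is reported as a POSIX UID here; mapping UIDs to names and reading embedded author/tag metadata would need platform-specific or format-specific tooling):

```python
import time
from pathlib import Path

def file_metadata(path: Path) -> dict:
    """Collect basic filesystem metadata for a single file."""
    st = path.stat()
    fmt = lambda ts: time.strftime("%Y-%m-%d", time.localtime(ts))
    return {
        "name": path.name,
        "type": path.suffix or "(none)",        # file extension as a crude type
        "size_bytes": st.st_size,
        "last_modified": fmt(st.st_mtime),
        "last_accessed": fmt(st.st_atime),
        "owner_uid": getattr(st, "st_uid", None),  # numeric owner on POSIX
    }

# Shallow scan of the current directory:
for p in Path(".").iterdir():
    if p.is_file():
        print(file_metadata(p))
```

In practice you would write these records to a database so they can be joined with permission and activity data later in the process.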

Stale Data Costs Just as Much Money as Active Data

Assessing and analyzing data usage is critical. Stale data can be very costly to your organization, as data storage is a major expense for most businesses. In fact, according to Business Insider, businesses around the world spend an estimated $62 billion a year on storage. In our research, we found that this high storage cost stems from the fact that most administrators aren’t proactively reviewing and deleting stale data, so it continues to take up space on an organization’s on-premises servers or cloud storage.
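Identifying stale data can start as simply as walking a share and flagging files whose last-access date is older than a cutoff. A rough sketch (the one-year threshold is an assumption; pick whatever your retention policy dictates, and note that some systems disable access-time updates):

```python
import time
from pathlib import Path

STALE_AFTER_DAYS = 365  # assumption: untouched for a year counts as stale

def stale_files(root, now=None):
    """Yield (path, size_bytes) for files not accessed within the cutoff."""
    cutoff = (now or time.time()) - STALE_AFTER_DAYS * 86400
    for p in Path(root).rglob("*"):
        if p.is_file():
            st = p.stat()
            if st.st_atime < cutoff:
                yield p, st.st_size

reclaimable = sum(size for _, size in stale_files("."))
print(f"Reclaimable: {reclaimable / 1e9:.2f} GB")
```

Summing the sizes of what this finds gives you the raw input for a storage-reclamation estimate.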

By the way, STEALTHbits offers a storage reclamation calculator (see below) to help organizations figure out how much they can save through the reduction of wasted storage space.

Annual Storage Cost & Storage Reclamation Calculator:
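The embedded calculator isn’t reproduced here, but the underlying arithmetic is straightforward: multiply total capacity by the stale fraction and the yearly cost per unit of storage. The figures below are hypothetical placeholders, not Stealthbits’ numbers:

```python
def annual_storage_savings(total_gb, stale_fraction, cost_per_gb_year):
    """Estimated annual savings from deleting or archiving stale data."""
    return total_gb * stale_fraction * cost_per_gb_year

# Hypothetical inputs: 50 TB of file data, 40% of it stale,
# $0.25 per GB per year in fully loaded storage cost.
savings = annual_storage_savings(total_gb=50_000,
                                 stale_fraction=0.40,
                                 cost_per_gb_year=0.25)
print(f"${savings:,.0f} per year")  # → $5,000 per year
```

Plug in your own capacity, stale percentage, and per-GB cost to get a defensible first estimate.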

The ultimate goal in the Collect and Analyze phase is to understand what you need to do and in what order as you implement a true data security program. In the next blog post of the series, we’ll show you why it’s important to monitor file activity before you perform any remediation.


Don’t want to miss any blog posts in this series? Subscribe to be notified as new posts are added to this series, here.

Free CPE Webinar

Join Adam Laub, our SVP of Product Marketing, on September 5, 2018 for the 2nd webinar in this series, “Data Footprint: Understanding Data Sensitivity and Prioritizing Risk.” He’ll walk you through the best methods to locate all your data and classify it, and show you why you should monitor data activity before taking action. By the end of this webinar, you’ll be able to compile a priority list of the active files and shares that put your company most at risk. This series qualifies for CPE credit.

Register for the upcoming Data Footprint: Understanding Data Sensitivity and Prioritizing Risk webinar, here.







© 2022 Stealthbits Technologies, Inc.
