25 changes: 25 additions & 0 deletions .github/label-codeowners.json
@@ -0,0 +1,25 @@
{
"1secure": "@netwrix/1secure-docs",
"access-analyzer": "@netwrix/accessanalyzer-docs",
"access-information-center": "@netwrix/accessinformationcenter-docs",
"activity monitor": "@netwrix/activitymonitor-docs",
"auditor": "@netwrix/auditor-docs",
"change-tracker": "@netwrix/changetracker-docs",
"data-classification": "@netwrix/dataclassification-docs",
"directory-manager": "@netwrix/directorymanager-docs",
"endpoint-policy-manager": "@netwrix/endpointpolicymanager-docs",
"endpoint-protector": "@netwrix/endpointprotector-docs",
"identity-manager": "@netwrix/identitymanager-docs",
"identity-recovery": "@netwrix/recoveryforactivedirectory-docs",
"password-policy-enforcer": "@netwrix/passwordpolicyenforcer-docs",
"password-reset": "@netwrix/passwordreset-docs",
"password-secure": "@netwrix/passwordsecure-docs",
"pingcastle": "@netwrix/pingcastle-docs",
"plat-gov-net-suite": "@netwrix/platgovnetsuite-docs",
"plat-gov-salesforce": "@netwrix/platgovsalesforce-docs",
"privilege-secure": "@netwrix/privilegesecure-docs",
"privilege-secure-discovery": "@netwrix/privilegesecurediscovery-docs",
"threat-manager": "@netwrix/threatmanager-docs",
"threat-prevention": "@netwrix/threatprevention-docs",
"kb": "@netwrix/kb-docs"
}
43 changes: 43 additions & 0 deletions .github/workflows/claude-issue-labeler.yml
@@ -98,6 +98,49 @@ jobs:
Issue body may have been modified by previous steps.
claude_args: '--allowedTools "Bash(gh:*),Skill(assign-label)"'

- name: Step 4 — Notify codeowners
if: steps.check-state.outputs.issue_state == 'OPEN' && github.event_name == 'issues'
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
# Fetch current labels from the issue (post-Step 3)
LABELS=$(gh issue view ${{ github.event.issue.number }} --repo ${{ github.repository }} --json labels --jq '.labels[].name')

if [ -z "$LABELS" ]; then
echo "No labels found — skipping codeowner notification"
exit 0
fi

# Read the mapping file
MAPPING=$(cat .github/label-codeowners.json)

# Collect matched teams (deduplicated)
TEAMS=""
while IFS= read -r label; do
team=$(echo "$MAPPING" | jq -r --arg l "$label" '.[$l] // empty')
if [ -n "$team" ]; then
# Deduplicate (identity-recovery and another label could map to same team)
if ! echo "$TEAMS" | grep -qF "$team"; then
TEAMS="${TEAMS:+$TEAMS }$team"
fi
fi
done <<< "$LABELS"

if [ -z "$TEAMS" ]; then
echo "No product labels matched — skipping codeowner notification"
exit 0
fi

echo "Notifying teams: $TEAMS"

# Post a single comment tagging all matched teams
COMMENT="Notifying codeowners: ${TEAMS}"
COMMENT="${COMMENT}"$'\n\n'"Please review this issue when you have a chance."

gh issue comment ${{ github.event.issue.number }} \
--repo ${{ github.repository }} \
--body "$COMMENT"

content-fix:
needs: process-issue
if: >-
47 changes: 47 additions & 0 deletions docs/accessanalyzer/11.6/admin/jobs/instantjobs/fs_sdd_delete
@@ -0,0 +1,47 @@

# FS_SDD_DELETE Instant Job

The FS_SDD_DELETE instant job deletes Sensitive Data Discovery (SDD) data from the Tier 1 database for specified criteria, hosts, or combinations thereof. This job is available in the Instant Job Library under the File System library.

![FS_SDD_DELETE Job](/images/accessanalyzer/11.6/admin/jobs/instantjobs/fs-sdd-delete.png)

## Runtime Details

- **Dependencies**: The 0.Collection Job Group must complete successfully before running this job
- **Target Hosts**: None (select Local host)
- **Scheduling**: Can be run as desired, typically on an ad-hoc basis
- **History Retention**: Not supported and must be turned off
- **Multi-console Support**: Not supported
- **Additional Notes**: This job performs permanent data deletion with no undo capability. All analysis tasks are disabled by default to prevent accidental data loss.

The FS_SDD_DELETE instant job provides a controlled method for removing Sensitive Data Discovery data from your Tier 1 database. Use this job to clean up SDD data for specific criteria, remove data associated with decommissioned hosts, or delete specific host-and-criteria combinations. Because this job permanently deletes data with no recovery option, all analysis tasks are disabled by default as a safety measure.

## Analysis Tasks for the FS_SDD_DELETE Job

To see the analysis tasks for this job, navigate to **Jobs > Instant Job Library > File System > FS_SDD_DELETE** and select the **Analysis Tasks** tab.

> **WARNING**: This job permanently deletes data from the database. This action cannot be undone. All analysis tasks are disabled by default to prevent accidental data loss. Carefully review the data to be deleted before enabling and running any analysis task.

The following analysis tasks are available for the FS_SDD_DELETE job:

- **Delete Criteria** – Deletes all SDD data for specified criteria from all hosts. Use this task when you want to remove all occurrences of specific criteria across your entire environment.
- **Delete Host** – Deletes all SDD data related to a specific host. Use this task when decommissioning a host or removing all SDD data associated with a particular system.
- **Remove Host & Criteria** – Deletes all SDD data for a specific host and criteria combination. Use this task for targeted removal of SDD data for a specific criterion on a specific host.

### Configuring the Analysis Tasks

Each analysis task requires manually populating temporary database tables before execution. Follow these steps to configure and run an analysis task:

1. Open SQL Server Management Studio and connect to your Tier 1 database.
2. Determine which analysis task you need to run based on the data you want to delete.
3. Populate the required temporary table(s):
- For **Delete Criteria**: Populate the `#Criteria` temporary table with the criteria names you want to delete
- For **Delete Host**: Populate the `#hosts` temporary table with the host names you want to delete
- For **Remove Host & Criteria**: Populate both the `#hosts` and `#Criteria` temporary tables with the specific host and criteria combinations
4. In the Access Analyzer console, navigate to **Jobs > Instant Job Library > File System > FS_SDD_DELETE**.
5. Select the **Analysis Tasks** tab.
6. Right-click the appropriate analysis task and select **Enable**.
7. Review the enabled task to verify it will delete the correct data.
8. Run the job by clicking **Run Job** and selecting **Local** as the target host.
9. After the job completes, verify the data has been deleted as expected.
10. Disable the analysis task to prevent accidental future deletions.
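The staging in step 3 can be sketched as follows. This is a hypothetical example: the column names and data types are assumptions, not documented schema, so verify them against what the analysis tasks in your environment expect.

```sql
-- Hypothetical sketch of staging rows for the analysis tasks.
-- Column names and types are assumptions; verify against your environment.
CREATE TABLE #Criteria (criteria_name NVARCHAR(256));
CREATE TABLE #hosts (host_name NVARCHAR(256));

-- Delete Criteria: list each criterion to remove from all hosts
INSERT INTO #Criteria (criteria_name) VALUES (N'Social Security Number');

-- Delete Host / Remove Host & Criteria: list each host to target
INSERT INTO #hosts (host_name) VALUES (N'FILESERVER01');
```

Note that `#`-prefixed tables are scoped to the SQL Server session that created them, so keep that session in mind when running the job.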
@@ -0,0 +1,47 @@

# FS_SDD_DELETE Instant Job

The FS_SDD_DELETE instant job deletes Sensitive Data Discovery (SDD) data from the Tier 1 database for specified criteria, hosts, or combinations thereof. This job is available in the Instant Job Library under the File System library.

![FS_SDD_DELETE Job](/images/accessanalyzer/12.0/admin/jobs/instantjobs/fs-sdd-delete.png)

## Runtime Details

- **Dependencies**: The 0.Collection Job Group must complete successfully before running this job
- **Target Hosts**: None (select Local host)
- **Scheduling**: Can be run as desired, typically on an ad-hoc basis
- **History Retention**: Not supported and must be turned off
- **Multi-console Support**: Not supported
- **Additional Notes**: This job performs permanent data deletion with no undo capability. All analysis tasks are disabled by default to prevent accidental data loss.

The FS_SDD_DELETE instant job provides a controlled method for removing Sensitive Data Discovery data from your Tier 1 database. Use this job to clean up SDD data for specific criteria, remove data associated with decommissioned hosts, or delete specific host-and-criteria combinations. Because this job permanently deletes data with no recovery option, all analysis tasks are disabled by default as a safety measure.

## Analysis Tasks for the FS_SDD_DELETE Job

To see the analysis tasks for this job, navigate to **Jobs > Instant Job Library > File System > FS_SDD_DELETE** and select the **Analysis Tasks** tab.

> **WARNING**: This job permanently deletes data from the database. This action cannot be undone. All analysis tasks are disabled by default to prevent accidental data loss. Carefully review the data to be deleted before enabling and running any analysis task.

The following analysis tasks are available for the FS_SDD_DELETE job:

- **Delete Criteria** – Deletes all SDD data for specified criteria from all hosts. Use this task when you want to remove all occurrences of specific criteria across your entire environment.
- **Delete Host** – Deletes all SDD data related to a specific host. Use this task when decommissioning a host or removing all SDD data associated with a particular system.
- **Remove Host & Criteria** – Deletes all SDD data for a specific host and criteria combination. Use this task for targeted removal of SDD data for a specific criterion on a specific host.

### Configuring the Analysis Tasks

Each analysis task requires manually populating temporary database tables before execution:

1. Open SQL Server Management Studio and connect to your Tier 1 database.
2. Determine which analysis task you need to run based on the data you want to delete.
3. Populate the required temporary tables:
- For **Delete Criteria**: Populate the `#Criteria` temporary table with the criteria names you want to delete
- For **Delete Host**: Populate the `#hosts` temporary table with the host names you want to delete
- For **Remove Host & Criteria**: Populate both the `#hosts` and `#Criteria` temporary tables with the specific host and criteria combinations
4. In the Access Analyzer console, navigate to **Jobs > Instant Job Library > File System > FS_SDD_DELETE**.
5. Select the **Analysis Tasks** tab.
6. Right-click the appropriate analysis task and select **Enable**.
7. Review the enabled task to verify it will delete the correct data.
8. Run the job by clicking **Run Job** and selecting **Local** as the target host.
9. After the job completes, verify the data has been deleted as expected.
10. Disable the analysis task to prevent accidental future deletions.
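The staging in step 3 can be sketched as follows. This is a hypothetical example: the column names and data types are assumptions, not documented schema, so verify them against what the analysis tasks in your environment expect.

```sql
-- Hypothetical sketch of staging rows for the analysis tasks.
-- Column names and types are assumptions; verify against your environment.
CREATE TABLE #Criteria (criteria_name NVARCHAR(256));
CREATE TABLE #hosts (host_name NVARCHAR(256));

-- Delete Criteria: list each criterion to remove from all hosts
INSERT INTO #Criteria (criteria_name) VALUES (N'Social Security Number');

-- Delete Host / Remove Host & Criteria: list each host to target
INSERT INTO #hosts (host_name) VALUES (N'FILESERVER01');
```

Note that `#`-prefixed tables are scoped to the SQL Server session that created them, so keep that session in mind when running the job.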
@@ -0,0 +1,95 @@
---
description: >-
  This article provides a script to update the Oracle schema if a collation conflict causes an error.
keywords:
- Access Analyzer
- SQL Server
- Oracle
- database permissions
sidebar_label: Cannot Resolve the Collation Conflict
tags: []
title: "Cannot Resolve the Collation Conflict Between SQL_Latin1_General_CP1_CI_AS and SQL_Latin1_General_CP1_CS_AS"
products:
- accessanalyzer
---
# Cannot Resolve the Collation Conflict Between SQL_Latin1_General_CP1_CI_AS and SQL_Latin1_General_CP1_CS_AS

## Related Queries

- Cannot resolve the collation conflict between `SQL_Latin1_General_CP1_CI_AS` and `SQL_Latin1_General_CP1_CS_AS`
- `0-Oracle_Servers` job succeeds but other Oracle collection jobs fail
- `SA_Oracle_Columns` collation conflict
- `SA_Oracle_SDD_MatchHits hit_column` collation issue

## Symptom

In Netwrix Access Analyzer, Oracle collection jobs fail with the following SQL exception, while the `0-Oracle_Servers` job may still complete successfully:

```
Error while running script : Script Name [] , portion [66] : System.Data.SqlClient.SqlException (0x80131904): Cannot resolve the collation conflict between "SQL_Latin1_General_CP1_CI_AS" and "SQL_Latin1_General_CP1_CS_AS" in the equal to operation.
```

You may also observe that other database-related collection jobs fail with the same error if using Access Analyzer v11.6.

## Cause

Access Analyzer uses a case-insensitive (CI) database collation by default, while Oracle’s data dictionary is case-sensitive. Certain Oracle-related columns must therefore use case-sensitive (CS) collation.

Schema updates that enforce this are managed through the `SA_SQL_Patches` table. If the required update is not applied or fails to update properly, those columns remain in CI collation, leading to a collation conflict during Oracle data collection.
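To confirm the diagnosis before applying the fix, you can check the current collation of the two columns named in the error (the table and column names below are taken from the resolution script in this article):

```sql
SELECT OBJECT_NAME(c.object_id) AS table_name,
       c.name                   AS column_name,
       c.collation_name
FROM sys.columns c
WHERE (c.object_id = OBJECT_ID('[dbo].[SA_Oracle_Columns]') AND c.name = 'column_name')
   OR (c.object_id = OBJECT_ID('[dbo].[SA_Oracle_SDD_MatchHits]') AND c.name = 'hit_column');
```

Rows reporting `SQL_Latin1_General_CP1_CI_AS` indicate the case-sensitive schema update has not been applied.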

## Resolution

> **IMPORTANT:** Back up the Access Analyzer database before making schema changes in SQL Server.

1. Run the following SQL script against the Access Analyzer database to update the affected Oracle collection table columns to the required case-sensitive collation:

```sql
SET ANSI_PADDING ON

/****** SA_Oracle_Columns ******/
IF EXISTS (
SELECT 1
FROM sys.columns c
WHERE c.object_id = OBJECT_ID('[dbo].[SA_Oracle_Columns]')
AND c.name = 'column_name'
AND c.collation_name <> 'SQL_Latin1_General_CP1_CS_AS' )
BEGIN
-- Drop the existing constraint
IF EXISTS (
SELECT 1
FROM sys.indexes i
INNER JOIN sys.index_columns ic ON i.object_id = ic.object_id AND i.index_id = ic.index_id
INNER JOIN sys.columns c ON ic.object_id = c.object_id AND ic.column_id = c.column_id
WHERE i.object_id = OBJECT_ID('[dbo].[SA_Oracle_Columns]') AND i.name = 'UQ_SA_Oracle_Columns_names' )
BEGIN
ALTER TABLE [dbo].[SA_Oracle_Columns] DROP CONSTRAINT [UQ_SA_Oracle_Columns_names];
END

-- Alter the column collation (the nvarchar(256) length here is an assumption; keep the column's existing data type and length)
ALTER TABLE [dbo].[SA_Oracle_Columns] ALTER COLUMN [column_name] [nvarchar](256) COLLATE SQL_Latin1_General_CP1_CS_AS NOT NULL;

-- Recreate the constraint
ALTER TABLE [dbo].[SA_Oracle_Columns] ADD CONSTRAINT [UQ_SA_Oracle_Columns_names] UNIQUE NONCLUSTERED
(
[sa_table_id] ASC,
[column_id] ASC,
[column_name] ASC
);
END


/****** SA_Oracle_SDD_MatchHits ******/
IF EXISTS (
SELECT 1
FROM sys.columns c
WHERE c.object_id = OBJECT_ID('[dbo].[SA_Oracle_SDD_MatchHits]')
AND c.name = 'hit_column'
AND c.collation_name <> 'SQL_Latin1_General_CP1_CS_AS' )
BEGIN
-- Alter the column collation
ALTER TABLE [dbo].[SA_Oracle_SDD_MatchHits] ALTER COLUMN [hit_column] [nvarchar](max) COLLATE SQL_Latin1_General_CP1_CS_AS NULL;
END

GO
```

2. After running the script, run the Oracle collection jobs again and confirm that the collation conflict does not reoccur.
@@ -0,0 +1,57 @@
---
description: >-
Clear stored SDD match data in Netwrix Access Analyzer by disabling match storage and running a SQL update against the SA_FSDLP_MatchHits table.
keywords:
- SDD matches
- SA_FSDLP_MatchHits
- clear match data
- SEEK System Scan
- sensitive data removal
- FS_SDD_DELETE
- Access Analyzer
products:
- access-analyzer
sidebar_label: Clearing Stored File System SDD Match Data
title: 'Clearing Stored File System SDD Match Data'
---

# Clearing Stored File System SDD Match Data

## Related Queries

- "Delete SDD matches"
- "Clear sensitive data from SA_FSDLP_MatchHits"
- "Remove stored match data Access Analyzer"
- "SDD matches still showing after scan"

## Overview

This article explains how to clear stored File System SDD match data from the database. This approach removes the sensitive data captured in match results while keeping the scan results themselves, such as match counts and criteria.

## Instructions

### Excluding Sensitive Data from the Next Collection

1. Open the **SEEK System Scan** query configuration.
2. On the _Sensitive Data Settings_ tab, uncheck the option **Store discovered sensitive data** to prevent new match data from being stored.

![Disabling SDD match storage in SEEK System Scan configuration](./../0-images/fsaa-store-discovered-sdd.webp)

### Clearing Previously Stored Match Hits

In the Netwrix Access Analyzer SQL database, run the following SQL statement:

```sql
UPDATE SA_FSDLP_MatchHits
SET MatchData = NULL,
MatchPrefix = NULL,
MatchSuffix = NULL
```
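To verify the statement cleared the stored match text, run a quick check against the same columns:

```sql
-- Expect 0 after the UPDATE above has run.
SELECT COUNT(*) AS remaining
FROM SA_FSDLP_MatchHits
WHERE MatchData IS NOT NULL
   OR MatchPrefix IS NOT NULL
   OR MatchSuffix IS NOT NULL;
```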

### Removing All SDD Matches

Use the `FS_SDD_DELETE` instant job to remove SDD matches entirely:
- **NAA v12.0:** [InstantJobs\FS_SDD_DELETE](https://docs.netwrix.com/docs/accessanalyzer/12_0/admin/jobs/instantjobs/fs_sdd_delete)
- **NAA v11.6:** [InstantJobs\FS_SDD_DELETE](https://docs.netwrix.com/docs/accessanalyzer/11_6/admin/jobs/instantjobs/fs_sdd_delete)

## Related Link
- [Configure the (SEEK) File System Scan Query](https://docs.netwrix.com/docs/accessanalyzer/12_0/solutions/filesystem/collection/seek_system_scans#configure-the-seek-file-system-scan-query)
@@ -6,11 +6,10 @@ sidebar_position: 20

# .NET Script Actions

A Visual Basic or C# script can be written and assigned to a policy by users or a Netwrix Engineer
via engaging Netwrix Professional Services. The script will be invoked by the Enterprise Manager for
an enabled policy.
Users or a Netwrix Engineer (through Netwrix Professional Services) can write and assign a Visual
Basic or C# script to a policy. The Enterprise Manager invokes the script for an enabled policy.

Follow the steps to add a .NET Script action to a policy/template.
To add a .NET Script action to a policy or template:

![Actions tab - Actions Configurations area](/images/threatprevention/8.0/admin/policies/actions/actionsconfigurationsarea.webp)

@@ -66,26 +65,25 @@ The Tools menu contains the following options:
- Run (F5) – Executes the script on the machine where the Administration Console is installed. It
launches the script from the Administration Console, allowing the user to test the script. When
running a .NET Script action, there are no prerequisites.
- Reset to Default Script – Replaces the existing script with the default script that is shipped
with Threat Prevention.
- Reset to Default Script – Replaces the existing script with the default script that ships with
Threat Prevention.
- Encrypt – Encrypts selected portions of the script to an encrypted string with a decrypt command
for run time. See note below explaining why only a plain text string, information in thae script
between quote marks (“), should be encrypted.
for run time. See the note below, which explains why only a plain text string (the information in
the script between quote marks (“)) should be encrypted.

:::tip
Remember, when testing a script in the Script Editor, the **Run** option executes the script in
the context of the user logged into the Administration Console. In production, when this script is
run as part of a policy, it will run in the context of the account configured for the Enterprise
Manager. If the script depends on specific user/account rights, then that should be taken into
account when using the **Run** option to test the script.
the context of the user logged into the Administration Console. In production, when this script
runs as part of a policy, it runs in the context of the account configured for the Enterprise
Manager. If the script depends on specific user/account rights, consider that when using the
**Run** option to test the script.
:::


:::warning
The Tools > Encrypt option is used to obfuscate plain text strings, e.g. credentials,
within the script. Encrypting functions or other commands result in the script not working. Only a
literal string should be encrypted, between the quote marks (“). The quote marks themselves should
not be included in the encryption.
The Tools > Encrypt option obfuscates plain text strings, e.g. credentials, within the script.
Encrypting functions or other commands results in the script not working. Encrypt only a literal
string, between the quote marks (“). Do not include the quote marks in the encryption.
:::

