When you push Dataverse (Dynamics 365) data out to a lake via Azure Synapse Link, you create a new copy of the data that must be protected independently. Below I list the realities, the constraints (what is and is not supported), and concrete mitigations/architecture patterns you can apply, focused on the serverless SQL pool but covering general options as well.
- Synapse Link writes Dataverse data into ADLS Gen2 (the lake) as files. Once the files exist in the lake, any identity that can read them can export or copy them, so you must treat the lake itself as the primary protection boundary.
- The Synapse workspace uses a workspace (managed) identity to read files and run serverless queries. By default that identity is commonly granted Storage Blob Data Reader on the whole container; if that role is overbroad, downstream users can reach data indirectly via lake database queries. Restrict the identity's scope to only the paths it needs.
- Dynamic data masking (DDM) and some other built-in SQL security features are not supported on serverless SQL pools (DDM is supported on dedicated pools and Fabric SQL). Do not rely on serverless to provide DDM.
- Row-level security (RLS) and column-level security have limited support for serverless external tables. Serverless can implement some column restrictions through views, but native RLS/DDM capabilities are fuller in dedicated SQL pools. Confirm your requirements before choosing serverless.
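To make the view-based column restriction concrete, here is a minimal sketch for serverless SQL pool. The storage URL, schema, table, and column names are illustrative assumptions, not from the source:

```sql
-- Sketch: expose only non-sensitive columns through a view over the lake files.
-- Names and the storage path are hypothetical placeholders.
CREATE VIEW curated.vw_contact_safe AS
SELECT contactid, fullname, city          -- sensitive columns (e.g. email) are simply not selected
FROM OPENROWSET(
        BULK 'https://<storage-account>.dfs.core.windows.net/dataverse/contact/*.csv',
        FORMAT = 'CSV', PARSER_VERSION = '2.0', HEADER_ROW = TRUE
     ) WITH (
        contactid  UNIQUEIDENTIFIER,
        fullname   NVARCHAR(200),
        city       NVARCHAR(100)
     ) AS rows;
GO
-- Grant analysts SELECT on the view only; do not grant them rights on the underlying files.
GRANT SELECT ON OBJECT::curated.vw_contact_safe TO [analyst_role];
```

This only restricts access if users reach the data exclusively through the serverless endpoint; anyone with direct storage permissions can still read the raw files, which is why the lake remains the primary boundary.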
Note:
- If you require truly enforced masking, row-level enforcement, and enterprise policy that cannot be bypassed, do not rely on serverless SQL pools alone. Use a hardened curated endpoint (dedicated SQL pool or Fabric SQL) as the enforcement point. Serverless is great for exploration and low-cost queries, but it is not a drop-in replacement for a secured, RLS/DDM-capable engine.
- The single biggest operational mistake I see is granting the workspace identity wide storage rights. Lock that down and your risk drops dramatically.
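To illustrate the curated-endpoint pattern above, a hedged sketch of native DDM and RLS as they can be enforced on a dedicated SQL pool (table, column, and predicate names are assumptions for the example):

```sql
-- Sketch: enforced masking on a dedicated SQL pool (not available on serverless).
ALTER TABLE dbo.Contact
    ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');

-- Sketch: row-level security filtering rows to the owning login.
CREATE FUNCTION dbo.fn_owner_filter(@OwnerLogin AS NVARCHAR(128))
    RETURNS TABLE
    WITH SCHEMABINDING
AS
    RETURN SELECT 1 AS allowed
           WHERE @OwnerLogin = USER_NAME();   -- rows visible only to their owner
GO
CREATE SECURITY POLICY dbo.ContactOwnerPolicy
    ADD FILTER PREDICATE dbo.fn_owner_filter(OwnerLogin) ON dbo.Contact
    WITH (STATE = ON);
```

Because the policy lives in the engine, it applies to every query against the table regardless of which tool issues it, which is the enforcement guarantee serverless views alone cannot give you.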
Please 'Upvote' (Thumbs-up) and 'Accept as answer' if the reply was helpful. This will benefit other community members who face the same issue.