Certification - Microsoft Certified: Fabric Analytics Engineer Associate
Microsoft Learn
Exam Topics
Plan, implement, and manage a solution for data analytics (10–15%)
Plan a data analytics environment
- Identify requirements for a solution, including components, features, performance, and capacity stock-keeping units (SKUs)
- Recommend settings in the Fabric admin portal
- Choose a data gateway type
- Create a custom Power BI report theme
Implement and manage a data analytics environment
- Implement workspace and item-level access controls for Fabric items
- Implement data sharing for workspaces, warehouses, and lakehouses
- Manage sensitivity labels in semantic models and lakehouses
- Configure Fabric-enabled workspace settings
- Manage Fabric capacity
Manage the analytics development lifecycle
- Implement version control for a workspace
- Create and manage a Power BI Desktop project (.pbip)
- Plan and implement deployment solutions
- Perform impact analysis of downstream dependencies from lakehouses, data warehouses, dataflows, and semantic models
- Deploy and manage semantic models by using the XMLA endpoint
- Create and update reusable assets, including Power BI template (.pbit) files, Power BI data source (.pbids) files, and shared semantic models
Prepare and serve data (40–45%)
Create objects in a lakehouse or warehouse
- Ingest data by using a data pipeline, dataflow, or notebook (see the notebook sketch after this list)
- Create and manage shortcuts
- Implement file partitioning for analytics workloads in a lakehouse
- Create views, functions, and stored procedures
- Enrich data by adding new columns or tables
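A minimal notebook sketch of the ingestion and object-creation bullets above, assuming a Fabric notebook attached to a lakehouse. The `Files/raw/sales.csv` path, the `sales_raw` / `vw_sales` names, and the `OrderDate`, `CustomerId`, and `Amount` columns are all hypothetical:

```python
# Hypothetical names; the `spark` session is provided by the Fabric notebook runtime.
from pyspark.sql import functions as F

# Ingest a raw CSV from the lakehouse Files area into a managed Delta table.
raw_df = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("Files/raw/sales.csv")
)

(
    raw_df
    .withColumn("IngestedAt", F.current_timestamp())  # enrich with a load timestamp
    .write
    .format("delta")
    .mode("overwrite")
    .partitionBy("OrderDate")                         # file partitioning for analytics workloads
    .saveAsTable("sales_raw")
)

# Expose a curated view over the table for downstream consumers.
spark.sql("""
    CREATE OR REPLACE VIEW vw_sales AS
    SELECT OrderDate, CustomerId, SUM(Amount) AS TotalAmount
    FROM sales_raw
    GROUP BY OrderDate, CustomerId
""")
```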
Copy data
- Choose an appropriate method for copying data from a Fabric data source to a lakehouse or warehouse
- Copy data by using a data pipeline, dataflow, or notebook
- Add stored procedures, notebooks, and dataflows to a data pipeline
- Schedule data pipelines
- Schedule dataflows and notebooks
Transform data
- Implement a data cleansing process (see the sketch after this list)
- Implement a star schema for a lakehouse or warehouse, including Type 1 and Type 2 slowly changing dimensions
- Implement bridge tables for a lakehouse or a warehouse
- Denormalize data
- Aggregate or de-aggregate data
- Merge or join data
- Identify and resolve duplicate data, missing data, or null values
- Convert data types by using SQL or PySpark
- Filter data
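The cleansing and type-conversion items above map naturally to a PySpark notebook. A hedged sketch, reusing the hypothetical `sales_raw` table and adding an equally hypothetical `customers_staging` / `dim_customer` pair for a Type 1 slowly changing dimension:

```python
# Illustrative only; table and column names are assumptions, not Fabric defaults.
from pyspark.sql import functions as F
from delta.tables import DeltaTable

df = spark.read.table("sales_raw")

cleaned = (
    df
    .dropDuplicates(["OrderId"])                          # resolve duplicate rows
    .filter(F.col("CustomerId").isNotNull())              # drop rows missing the key
    .withColumn("Amount", F.col("Amount").cast("double")) # convert data types
    .na.fill({"Amount": 0.0})                             # handle missing values
    .withColumn("OrderDate", F.to_date("OrderDate"))
)
cleaned.write.format("delta").mode("overwrite").saveAsTable("sales_clean")

# Type 1 slowly changing dimension: new attribute values simply overwrite the old ones.
customers_src = spark.read.table("customers_staging")
dim = DeltaTable.forName(spark, "dim_customer")
(
    dim.alias("t")
    .merge(customers_src.alias("s"), "t.CustomerId = s.CustomerId")
    .whenMatchedUpdateAll()       # Type 1: update in place, no history kept
    .whenNotMatchedInsertAll()
    .execute()
)
```

A Type 2 dimension would instead expire the current row and insert a new one with validity dates, so the merge condition and update sets change accordingly.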
Optimize performance
- Identify and resolve data loading performance bottlenecks in dataflows, notebooks, and SQL queries
- Implement performance improvements in dataflows, notebooks, and SQL queries
- Identify and resolve issues with Delta table file sizes (see the sketch after this list)
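Delta file-size issues usually mean too many small Parquet files. As a sketch, standard Delta Lake maintenance commands run from a notebook; the `sales_clean` table name is carried over from the earlier hypothetical examples:

```python
# Compact small files and clean up unreferenced ones; table name is illustrative.
spark.sql("OPTIMIZE sales_clean")                  # bin-compaction of small Parquet files
spark.sql("VACUUM sales_clean RETAIN 168 HOURS")   # remove unreferenced files older than 7 days

# Inspect the resulting layout to confirm healthier file counts and sizes.
spark.sql("DESCRIBE DETAIL sales_clean").select("numFiles", "sizeInBytes").show()
```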
Implement and manage semantic models (20–25%)
Design and build semantic models
- Choose a storage mode, including Direct Lake
- Identify use cases for DAX Studio and Tabular Editor 2
- Implement a star schema for a semantic model
- Implement relationships, such as bridge tables and many-to-many relationships
- Write calculations that use DAX variables and functions, such as iterators, table filtering, windowing, and information functions
- Implement calculation groups, dynamic strings, and field parameters
- Design and build a large format dataset
- Design and build composite models that include aggregations
- Implement dynamic row-level security and object-level security
- Validate row-level security and object-level security
Optimize enterprise-scale semantic models
- Implement performance improvements in queries and report visuals
- Improve DAX performance by using DAX Studio
- Optimize a semantic model by using Tabular Editor 2
- Implement incremental refresh
Explore and analyze data (20–25%)
Perform exploratory analytics
- Implement descriptive and diagnostic analytics
- Integrate prescriptive and predictive analytics into a visual or report
- Profile data (see the profiling sketch after this list)
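Profiling can be done directly in a notebook before building visuals. A minimal sketch against the hypothetical `sales_clean` table from the earlier examples:

```python
# Quick profile: row count, descriptive statistics, and per-column null counts.
from pyspark.sql import functions as F

df = spark.read.table("sales_clean")   # hypothetical table name

print("rows:", df.count())
df.summary("count", "min", "max", "mean").show()   # descriptive statistics per column

# Null counts per column highlight data-quality gaps worth investigating.
df.select([F.count(F.when(F.col(c).isNull(), c)).alias(c) for c in df.columns]).show()
```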
Query data by using SQL
- Query a lakehouse in Fabric by using SQL queries or the visual query editor
- Query a warehouse in Fabric by using SQL queries or the visual query editor
- Connect to and query datasets by using the XMLA endpoint
Services
Power Query
Dataflows (Gen2)
Azure Data Factory
- Managed, serverless ETL/ELT service
- SSIS (SQL Server Integration Services) in the cloud
Azure Data Factory - Data Factory Pipelines
Data Factory Pipelines can be used to orchestrate Spark, Dataflow, and other activities, enabling you to implement complex data transformation processes.
Microsoft Fabric
Capacity
Key points
- A Microsoft Fabric capacity resides on a tenant.
- Each capacity that sits under a specific tenant is a distinct pool of resources allocated to Microsoft Fabric.
Benefits
- Centralized management of capacity
  Rather than provisioning and managing separate resources for each workload, with Microsoft Fabric your bill is determined by 2 variables:
  - The amount of compute you provision
    - A shared pool of capacity that powers all capabilities in Microsoft Fabric
    - Pay-as-you-go and 1-year Reservation
  - The amount of storage you use
    - A single place to store all data
    - Pay-as-you-go (billable per GB/month)
Capacity License SKUs
Capacity licenses are split into SKUs. Each SKU provides a set of Fabric resources for your organization. Your organization can have as many capacity licenses as needed.
Capacity Unit
- Capacity unit (CU) = Compute Power
- CU Consumption
  Each capability, such as Power BI, Spark, or Data Warehouse, with the associated queries, jobs, or tasks, has a unique consumption rate.
Access Control
- Tenant
- Capacity
- Workspace
- Item
  Data Warehouse, Data Lakehouse, Dataflow, Semantic Model, etc.
- Object
  Table, View, Function, Stored Procedure, etc.
Workspace
A workspace is created under a capacity. A workspace is a container for Microsoft Fabric items.
Workspace - License mode
- Microsoft Learn - Microsoft Fabric concepts and licenses
- The workspace license mode dictates what kind of capacity the workspace can be hosted in and, as a result, the capabilities available.
Workspace - Roles
- Workspace roles apply to all items in the workspace
- Roles in workspaces in Microsoft Fabric
- Admin
  - Update and delete the workspace
  - Add or remove people, including other admins
- Member
  Everything an admin can do, except the above two.
  - Add members or others with lower permissions
  - Allow others to reshare items
- Contributor
  Everything a member can do, except the above two.
- Viewer
  Read-only access to the workspace without API access.