Microsoft Fabric Updates Blog


Fabric Spark Autotune and Run Series Job Analysis

We are thrilled to announce the public preview of Run Series Analysis, in conjunction with our recent announcement of the Autotune feature at the Fab conference. These two features are designed to give you insight into Spark application executions across recurring runs of your Notebooks and Spark Job Definitions, facilitating performance tuning and …

Environment is now generally available

Exciting news! The environment is now officially generally available in Microsoft Fabric. What is an environment in Microsoft Fabric? An environment serves as a consolidated container for both your hardware and software settings in Spark. Within this unified interface, you can select the desired Spark runtime, install libraries, and configure …

Easily connect your data with the new modern get data experience for data pipeline

We are excited to share that the new modern get data experience in data pipelines provides a simple way to connect to your data: intuitively browse different Fabric artifacts through the OneLake datahub and get to your data as quickly as possible. We listened to customer feedback from various channels to …

Alerting and acting on data from the Real-Time hub

Microsoft Fabric’s new Real-Time hub makes it easier than ever to discover, preview, and use streaming data from around your organization. But the data in these streams is only as useful as the actions and changes it can drive in your business. Data Activator provides a no-code experience for automatically taking actions when patterns or …

Announcing the Public Preview of Copilot for Data Warehouse in Microsoft Fabric

We are excited to announce the public preview release of Copilot for Data Warehouse in Microsoft Fabric, a groundbreaking AI assistant designed to transform your data warehousing tasks. Data warehouse development can be daunting for SQL developers, especially under tight timelines where insights are needed “yesterday”. Developers may spend hours writing code, building schemas, documentation, …