So, what happens to the Power BI folks? Nothing changes. You can think of Microsoft Fabric as a Power BI upgrade that gives even more capabilities to upstream workloads. Power BI Desktop will continue to be updated monthly, including announced features such as Copilot integration to write DAX queries or assist with report creation. Power BI Service will keep receiving updates in step with those monthly releases, and your Pro, Premium, and PPU licenses will continue to work exactly as before! In fact, Microsoft Fabric was built on top of the Power BI Premium platform.

As a business analyst myself, I’ve grown alongside Microsoft’s Self-Service Business Intelligence (SSBI) tools, especially Power BI.

I mean, I’ve always been the spreadsheet guy. I’ve grown with Excel’s array formulas (the CTRL+SHIFT+ENTER ones), pivot tables, and later on with the introduction of the first wave of SSBI in the form of Excel add-ins: Power Pivot, Power Query, and Power View… You might know the rest of the story: migrating to Power BI Desktop solutions and then to Power BI Service.

Every time you deliver a complex business solution, more challenging problems arise, and you need to dive deeper to tackle them. It's no longer enough to know data visualization; you also need knowledge of modeling, DAX, star schemas, import vs. DirectQuery, ETL… On the other hand, each Power BI upgrade brings simpler ways to implement clever workarounds and get things done. All of this turns you into a more versatile data analyst, but one increasingly focused on the data and tooling side, drifting away from the original business problems that prompted the deep dive in the first place, simply because there's too much to learn.

“The more you know, the more you realize you don’t know.”

Even though the M language was my first “love” in Power BI, it was also always the problem: it was the only tool Power Query offered to move and transform data. The larger the model, the deeper we had to dive into M and into crazy optimizations, because, although it is a powerful ETL tool, it's not the best fit for large volumes of data from sources that don't support query folding. Even leveraging the Dataflows Gen1 enhanced compute engine wasn't enough… something was still off.

Now, with Microsoft Fabric, we get the upgrade we actually needed: new items (formerly known as artifacts) with proper ETL approaches, so the job can be done with whichever tool is most appropriate (even M). By “appropriate,” I mean that there may be several ways to achieve your goal, and you can pick the one your team is comfortable with: R, Python, T-SQL, Spark, you name it. All of it is orchestrated by the amazing Data Pipelines. And there's now a proper way to spread the word of the “single source of truth”: Lakehouses and/or Warehouses, topped by a default dataset layer that you can enrich with DAX measures and use as the source of a composite model, delivering data to decision-makers through a Power BI report.

Microsoft Fabric is where Self-Service BI truly meets Enterprise BI, with proper tooling to handle everything from simple to highly complex scenarios at scale. The simplicity of a SaaS platform makes it incredibly easy to create powerful workspace items for several end-to-end (e2e) scenarios that broaden the scope of pre-Fabric Power BI solutions:

Data Science e2e:

Data Warehouse e2e:

Lakehouse e2e:

Real-time analytics e2e:

Did you notice one thing? Power BI is the common element across every e2e scenario. And it comes with a revolutionary new feature called Direct Lake, which lets all workloads consume fresh data seamlessly, straight from the single shared storage.

The best part is that every engine is being rebuilt from scratch (Fabric is still in preview at the time of this writing) to decouple storage from compute, so they are all optimized to read and write the same file format, optimized for analytics: delta parquet. This removes the need to copy data into an engine-specific format for each task. All of this is anchored in a solid foundation called OneLake, the shared logical storage for Fabric solutions.

All this allows for better integrations, writebacks, and endless architecture options, addressing one of Power BI's Achilles' heels by moving integration and the single source of truth further upstream in the process. It also whets our appetite for knowledge in these new areas. In the short term, access to the new toys (tools) will let your multidisciplinary data team deliver better solutions faster, simply because Fabric items are designed for deep integration. If you're more of a solo developer, you can explore things at your own pace, taking advantage of the e2e scenarios and experimenting with new items without commitment. In the near future, we'll also have Copilot to assist us with that part.

Everything is quite new, and we need time to experiment and propose new architectures for smaller companies with smaller budgets. The solution will possibly be a mix of a smaller F capacity and Power BI Pro licenses (for Power BI-related consumption). This is still to be evaluated, but I've been told that the smallest and most affordable capacity, F2, is quite powerful and might fit small companies' workloads and budgets rather well.

For larger companies that could already afford a P1 (now equivalent to F64) license, switching to Fabric is a no-brainer. You'll see less data duplication and lower compute consumption, because dataset refreshes are no longer needed, and you'll broaden your scope with new workloads. Moreover, the investment to upgrade your capacity will provide much more value to the business.

Apart from those crucial but boring licensing considerations, Microsoft Fabric is undoubtedly a game changer. It provides a single platform capable of delivering Self-Service BI alongside Enterprise BI, significantly reducing the activation energy to architect truly integrated solutions, breaking down data silos to deliver value to the business.