Introduction to Power BI (Updated 2023)
An introduction to the Power BI platform, updated with 2023 changes, to help users better understand the architecture, key concepts, and general workflow for building reports.
The MODACO library is our definitive collection of resources for data-driven businesses.
Examining the different methodologies available for exporting data from Google Analytics (GA4 Properties) into BigQuery and the pros and cons of each approach.
Let’s look at different ETL tool options for some common paid media data sources. In our evaluation, we compare coverage, pricing, and sync frequency.
Finding the right Business Intelligence (BI) tool to serve your organization’s needs can be challenging: the market offers many options with varying capabilities and features. Here are some key factors to consider when selecting a BI tool.
SQL window functions are powerful tools for manipulating your datasets. However, it is important to use the correct tie-breakers in your ordering clauses to ensure you get consistent results.
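The tie-breaker problem can be sketched with a small SQLite example (the `orders` table and its columns are hypothetical, chosen only for illustration). When two rows tie on the `ORDER BY` key, `ROW_NUMBER` assigns their ranks in an engine-dependent order; adding a unique column as a secondary sort key makes the result deterministic:

```python
import sqlite3

# In-memory table of orders; rows 1 and 2 share the same order_date,
# so ranking by date alone is ambiguous.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer TEXT, order_date TEXT);
    INSERT INTO orders VALUES
        (1, 'acme', '2022-01-01'),
        (2, 'acme', '2022-01-01'),
        (3, 'acme', '2022-02-01');
""")

# Without a tie-breaker, whether id 1 or id 2 gets row number 1 is
# up to the engine and can change between runs or engines:
#   ROW_NUMBER() OVER (PARTITION BY customer ORDER BY order_date)

# Adding the unique id as a secondary sort key removes the ambiguity:
deterministic = """
    SELECT id, ROW_NUMBER() OVER (
        PARTITION BY customer ORDER BY order_date, id
    ) AS rn
    FROM orders
"""

rows = conn.execute(deterministic).fetchall()
print(rows)  # [(1, 1), (2, 2), (3, 3)]
```

The same principle applies in Snowflake, BigQuery, or Postgres: any window ordering that can tie should end with a unique column.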
With DBT Cloud deprecating support for jobs running on versions earlier than 1.0, all DBT Cloud projects need to be upgraded from 0.x DBT versions by July 1st, 2022. DBT has published a guide to assist with upgrades to 1.0. Here are some additional tips to help with your migration: on your local dev environment, if you run into issues with a direct upgrade from 0.x to 1.0, first uninstall all 0.x components, including dbt-core and the dbt-adapters, and then install 1.0.0. Run a DBT model to ensure that your project compiles correctly without errors on
Snowflake makes it easy to copy data from an S3 bucket (JSON, CSV, Parquet) into a Snowflake table, either using its bulk load option or Snowpipe for continuous loading. In this blog we will discuss ingestion using the bulk load option and, more importantly, how to implement parallel processing using Snowflake’s task trees. COPY INTO: Snowflake Bulk Loading Once secure access to your S3 bucket has been configured, the COPY INTO command can be used to bulk load data from your S3 stage into Snowflake. COPY INTO is an easy-to-use and highly configurable command that gives you the option to specify a subset
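The shape of such a bulk load statement can be sketched with a small Python helper that assembles the Snowflake COPY INTO SQL (the table name `raw.events`, the stage name `my_s3_stage`, and the file pattern here are hypothetical placeholders):

```python
def build_copy_into(table, stage, file_format="JSON", pattern=None):
    """Assemble a Snowflake COPY INTO statement that bulk loads
    staged S3 files into a target table."""
    sql = f"COPY INTO {table} FROM @{stage} FILE_FORMAT = (TYPE = {file_format})"
    if pattern:
        # PATTERN restricts the load to the subset of staged files
        # whose paths match the given regular expression.
        sql += f" PATTERN = '{pattern}'"
    return sql

stmt = build_copy_into("raw.events", "my_s3_stage", "JSON",
                       pattern=".*2022.*[.]json")
print(stmt)
```

Running the generated statement in Snowflake (via a worksheet or a connector) performs the actual load; the helper only illustrates how the `FROM @stage`, `FILE_FORMAT`, and `PATTERN` clauses fit together.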
The modern data stack consists of a collection of tools that ingest, store, transform, and visualize your data. The primary components include:

- A data warehouse, the central store for your data (Snowflake, BigQuery)
- ETL/ELT tools that ingest data into the data warehouse (Fivetran, Stitch)
- Data transformation tools for cleaning and transforming your data (DBT)
- BI / reporting tools (Looker, Tableau, Mode)
- Reverse ETL tools (Census, Hightouch)

With the right tools in place, you will be able to rapidly report on your data and efficiently manage and scale your data ecosystem. Data Warehouse At the core of your data