I wish I could haha.
From my understanding so far, the whole process (getting the data from the source system until it's curated/transformed and imported into PBI) is split into 5 layers: source, secure, raw, query, and report (the last one is where you create your star schema model). In the query layer you model the data (basically the raw data you get from the source) into hub, link, and satellite tables. A hub is a table that contains only the business key(s) of an entity (e.g. the customerid of the customer entity). A satellite table holds the descriptive attributes of that entity (e.g. the customer's first name, last name, address, etc.). Finally, a link table is usually used to connect two hub tables together (for example, you can have a connection between the customer hub and the location hub).

I guess you can already tell that you end up with a very big and complex model, and that's where my question comes in: is it really necessary to implement this technique? I hope I'll find the answer soon. Also, in case you're interested, you can check chapters 4, 5 and 6 of this book.
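If it helps to see it concretely, here's a minimal sketch of that hub/link/satellite split using SQLite from Python. The table and column names are just illustrative assumptions, not from any particular Data Vault implementation:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hub: only the business key of the entity (plus load metadata).
cur.execute("""
CREATE TABLE hub_customer (
    customer_id   TEXT PRIMARY KEY,   -- business key
    load_date     TEXT,
    record_source TEXT
)""")
cur.execute("""
CREATE TABLE hub_location (
    location_id   TEXT PRIMARY KEY,
    load_date     TEXT,
    record_source TEXT
)""")

# Satellite: the descriptive attributes of the hub's entity.
cur.execute("""
CREATE TABLE sat_customer (
    customer_id TEXT REFERENCES hub_customer(customer_id),
    load_date   TEXT,
    first_name  TEXT,
    last_name   TEXT,
    address     TEXT,
    PRIMARY KEY (customer_id, load_date)
)""")

# Link: connects two hubs (here customer <-> location).
cur.execute("""
CREATE TABLE link_customer_location (
    customer_id TEXT REFERENCES hub_customer(customer_id),
    location_id TEXT REFERENCES hub_location(location_id),
    load_date   TEXT,
    PRIMARY KEY (customer_id, location_id)
)""")

# Load one customer, then join hub + satellite back together --
# this join fan-out is part of why the full model gets big fast.
cur.execute("INSERT INTO hub_customer VALUES ('C1', '2024-01-01', 'crm')")
cur.execute(
    "INSERT INTO sat_customer VALUES ('C1', '2024-01-01', 'Ada', 'Lovelace', '1 Main St')"
)
row = cur.execute("""
    SELECT h.customer_id, s.first_name, s.last_name
    FROM hub_customer h
    JOIN sat_customer s ON s.customer_id = h.customer_id
""").fetchone()
print(row)  # ('C1', 'Ada', 'Lovelace')
```

Even this toy example needs three table types and a join to answer "what is customer C1's name", which is exactly the complexity trade-off the question is about.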
I'm working on creating a data warehouse for a startup I work at. I'm relatively new to data engineering and just wanted to know if I'm going down the right path. I just purchased Building a Scalable Data Warehouse with Data Vault 2.0 and plan on studying the architecture.
I guess I posted this hoping to find out: if people here had the opportunity to create a new data warehouse, would Data Vault be the architecture they'd go with?
It depends on the data warehouse architecture you will be implementing.
I recommend these two books, which also serve as reference manuals for DEs:
Building a Scalable Data Warehouse with Data Vault 2.0
The Data Warehouse Toolkit: The Definitive Guide to Dimensional Modeling