I recently spoke at the Tampa Dev Azure Meeting in a webinar format on 31st January 2018. We received some interesting questions during the presentation. My aim was to introduce the basic concepts of Big Data, Azure Data Lake, Azure Data Lake Store (ADLS), Azure Data Factory (ADF) and Power BI.
I would like to thank the Tampa Dev organisers and all attendees for giving me the opportunity to speak in this session.
In this session you will learn the basic concepts of:
- Big Data
- Azure Data Lake
- Azure Data Lake Store (ADLS)
- Azure Data Factory (ADF)
- Azure Analysis Services (AAS)
- Power BI
- And how they relate
Session recording:
You can see and download the presentation file here:
https://www.biinsight.com/wp-content/uploads/2018/02/TampaCC-2017-10-Presentation.pdf
Soheil,
I have the same architecture you showed in this demo: Data Sources -> ADF -> ADLS -> ADLA (a U-SQL job that processes and preps the data into a dimensional model as CSV files; a rough sketch of that step is below) -> ADLS -> AAS -> Power BI. I am also not using SQL Data Warehouse, as AAS connects nicely to the CSV files in ADLS.
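A minimal U-SQL sketch of that prep step (the paths, columns and names here are placeholders, not my actual job):

```
// Read raw product data from ADLS (placeholder path and schema).
@raw =
    EXTRACT ProductId int,
            ProductName string,
            Category string
    FROM "/raw/products.csv"
    USING Extractors.Csv(skipFirstNRows : 1);

// Shape it into a dimension table.
@dimProduct =
    SELECT DISTINCT ProductId,
                    ProductName,
                    Category
    FROM @raw;

// Write the dimension back to ADLS as a CSV file for AAS to read.
OUTPUT @dimProduct
TO "/curated/DimProduct.csv"
USING Outputters.Csv(outputHeader : true);
```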
In light of Microsoft no longer promoting ADLA and instead promoting Databricks, which is a PaaS solution (whereas I like a SaaS solution such as ADLA): any suggestions on how to make the current architecture future proof and enable it to handle unstructured data and IoT data in 2019?
Thanks,
Mukesh Dutta
Very interesting, and the only video tutorial on the internet.
I have the same scenario: source -> ADF -> ADLS Gen2 -> ADB -> ADLS Gen2 (partitioned CSV files). The ADB step that writes the partitioned files is roughly as sketched below.
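A simplified Databricks (PySpark) sketch of that write step; the storage account, container names and partition columns are placeholders:

```python
# Runs in a Databricks notebook, where `spark` is the pre-defined SparkSession.
# Read the staged data from ADLS Gen2 (placeholder account/container/path).
df = (spark.read
      .option("header", "true")
      .csv("abfss://staging@mystorageaccount.dfs.core.windows.net/sales/"))

# Write it back to ADLS Gen2 as partitioned CSV files for the tabular model.
(df.write
   .mode("overwrite")
   .partitionBy("Year", "Month")  # placeholder partition columns
   .option("header", "true")
   .csv("abfss://curated@mystorageaccount.dfs.core.windows.net/sales/"))
```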
I want to build a tabular model on top of the data in ADLS Gen2. I was able to connect using the Azure Blob Storage connector and build a model, but I face issues when I try to deploy the model to AAS: after I provide the account key, I get this error:
Failed to save modifications to the server. Error returned: ‘The credentials provided cannot be used for the AzureBlobs source. (Source at https://**************.blob.core.windows.net/.). The exception was raised by the IDbCommand interface
Any thoughts on what could have gone wrong?