Abhishek Agrawal | Azure Cloud Technology, SQL, Power BI
3 Reviews

Azure Data Engineer | Microsoft Certified | 7 Years of Azure Cloud Experience | NIT RAIPUR GRADUATE | 2 Years of Teaching Experience | AZURE DATA FACTORY | AZURE DATABRICKS | AZURE SYNAPSE ANALYTICS | ETL | DP-203 | DP-900 | AZ-104 | AZ-900 | BIG DATA, SPARK, PYSPARK, SQL, PYTHON, SPARKSQL | Azure Solution Architect | Azure Admin | Data Analyst | HELPED 50+ People Come into Azure Cloud from Scratch.

-All classes include lab sessions and daily practice, with almost zero theory.
-End to End Project Implementation using Azure.
-Trained people from scratch to become Microsoft-certified cloud developers.
-Helped people across the globe from India, Australia, Finland, Norway, Canada and US.
-Get online recording of classes with lifetime access.
-Helped experienced IT professionals learn Azure cloud technology and secure good jobs.
-Providing support in interview preparation and Resume Building.
-Support for freshers to secure good jobs as Business Analyst, Data Analyst, or Data Engineer within 2-3 months of training.
-Helped people move from Informatica, SSIS, SSRS, and SSAS to the Azure Cloud environment.
-Providing Microsoft Certification training and support.
-Practice is key to success.
-I believe in Practical Implementation instead of theoretical knowledge.

Brief About Me:
I'm an Azure Data Engineer with 7+ years of experience and a proven ability to deliver short- and long-term projects in the data engineering, data warehousing, machine learning, and business intelligence realms. My passion is partnering with my clients to deliver top-notch, scalable data solutions that provide immediate and lasting value. I completed my engineering degree (B.Tech) at NIT RAIPUR.

I specialize in the following data solutions:
✔️ Building end-to-end ETL pipelines using Azure cloud tools.
✔️ Building data warehouses using modern cloud platforms and technologies.
✔️ Creating and automating data pipelines, & ETL processes
✔️ Building highly intuitive, interactive dashboards.
✔️ Data Cleaning, Processing, and Machine Learning models.
✔️ Data strategy advisory & technology selection/recommendation

Technologies I most frequently work with are:
☁️ Cloud: Azure
☁️ Cloud Tools: Azure Data Factory, Azure Synapse Analytics, Azure Databricks, Azure Data Lake, Azure Analysis Services, Azure DevOps, Azure Key Vault, Azure Active Directory.
💬Language: SQL, Python, PySpark, SparkSQL, R, SAS, Dash.
👨‍💻 Databases: SQL Server, Azure Synapse, Azure SQL Database
⚙️ Data Integration/ETL: SAP HANA, Dynamics 365, EPM Onyx, QAD
📊 BI/Visualization: PowerBI, Excel
🤖 Machine learning: Jupyter Notebook, Python, Pandas, NumPy, Statistics, Probability.

Subjects

  • SQL Beginner-Expert

  • Azure Beginner-Expert

  • Power BI Beginner-Intermediate

  • Azure Data Factory Beginner-Expert

  • Azure DevOps Beginner-Expert

  • Azure Data Engineer Beginner-Expert

  • Azure DataBricks Beginner-Expert

  • AZ-104 (Azure Administrator)

  • AZ-900

  • Azure Synapse Beginner-Expert

  • Azure Data Lake Beginner-Expert

  • AZURE Analysis Service Beginner-Expert

  • DP-203

  • DP-900


Experience

  • Azure Data Engineer (Apr 2023 – Present) at ALDI Sued
    -Spearheading the migration process from Hadoop cluster to Azure Databricks spark cluster
    -Ensuring optimal code performance in the spark cluster and constantly seeking out ways to improve it
    -Industrializing the code base to facilitate seamless scaling
    -Taking the lead in forming the CI/CD process for both Azure Data Factory and Azure Databricks
    -Collaborating closely with Data Scientists to provide them with accurate data to derive actionable business insights
  • AZURE DATA ENGINEER (Aug 2021 – Present) at SMITHS DETECTION
    RETAIL HUB DASHBOARD
    -Developed a dynamic spark notebook to pivot the web crawling data efficiently
    -Integrated Azure Synapse Warehouse and Data Lake with ADF and created parameterised pipeline in Data Factory for an end-to-end ETL process.
    -Designing and developing Warehouse Objects and Stored Procedures for streamlined data processing
    -Developing a Logic app to send daily email extracts from Azure Data Lake to the relevant stakeholders
    -Establishing a robust CI/CD process for Azure Data Factory to ensure seamless deployment and management of the pipeline.

    Project Vector HR Dashboard Development:
    1) Created warehouse views to report various business KPIs such as Joiners, Leavers, Headcount, and Future Joiners and Leavers.
    2) Created a stored procedure to update previous headcount reports.
    3) Improved dashboard performance by refining the existing schema and data models.

    HR Dashboard Optimization. 
    1) Increased the speed of the PySpark and Python code written in Databricks.
    2) Made several changes to the code, such as replacing collect() and executemany() with the Spark DataFrame writer, and fetchall() with spark.read.jdbc.
    3) Adapted standalone Python commands and integrated them with Spark so that work is distributed across the cluster rather than executed serially on the driver.
    4) These optimizations gave a 20% reduction in runtime and a 30% reduction in overall cost.
    5) Removed manual intervention in the HR Data Load pipeline and automated it.
    6) Added a feature for deleting source records to the existing dashboard.
    7) Analysed the existing dashboard with business users to find discrepancies, and worked with them to remove those anomalies by adding functionality using ADF, ADLS, and Azure Databricks.
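
A note on points 2 and 3: the underlying idea, pushing computation into the data engine instead of pulling every row into driver-side Python, applies well beyond Spark. A minimal small-scale sketch using sqlite3, with a hypothetical headcount table (for illustration only, not the project's code):

```python
import sqlite3

# Hypothetical HR table, used only to illustrate the pattern.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE headcount (dept TEXT, employees INTEGER)")
conn.executemany("INSERT INTO headcount VALUES (?, ?)",
                 [("Sales", 40), ("IT", 25), ("Sales", 10)])

# Slow pattern: fetch every row to the client, then aggregate in Python.
rows = conn.execute("SELECT dept, employees FROM headcount").fetchall()
totals = {}
for dept, n in rows:
    totals[dept] = totals.get(dept, 0) + n

# Faster pattern: let the engine aggregate; only results cross the wire.
totals_sql = dict(conn.execute(
    "SELECT dept, SUM(employees) FROM headcount GROUP BY dept"))

assert totals == totals_sql == {"Sales": 50, "IT": 25}
```

In Spark the same principle shows up as preferring DataFrame operations and spark.read.jdbc over collect() followed by client-side loops.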

    EPM ONYX BUSINESS REPORTING

    1) Gathered requirements from stakeholders for business reporting.
    2) Prepared a technical & scope analysis document for the data models, including facts & dimensions mapping.
    3) Fetched data from a multidimensional source system via API through Azure Data Factory.
    4) Built full-load and delta-load pipelines in Azure Data Factory.
    5) Stored the raw data in Azure Data Lake in a date-time folder structure.
    6) Wrote transformation logic in PySpark and Spark SQL in Databricks to convert the raw data into facts and dimensions.
    7) Implemented a semantic data model using Azure Analysis Services for various analytics Power BI reports/dashboards.
    8) Created a technical incident/challenge document for effective communication across the team.

    MLT O2C DASHBOARD
    -Added a new report to the existing Order-to-Cash dashboard to track market lead time.
    -Read the data uploaded by users through Logic App into Azure Blob Storage using Azure Databricks.
    -Made required transformation in Databricks using SQL and Python and wrote the transformed data into Azure Blob storage.
    -Read the Data into Azure Synapse and made Data Models in Azure Analysis Service for Power BI Visualization.

    Disaster Recovery and CI/CD Pipeline
    -Took backups of Azure Databricks, Azure Data Factory, and Azure Analysis Services using methods and tools such as the Databricks CLI, ARM templates, SSMS, and PowerShell.
    -Designed a CI/CD pipeline using Azure DevOps to automate the build and release processes across various environments for Azure Data Factory (ADF), Azure Synapse (Data Warehouse) & Azure Analysis Services.

    KPI REPORTING USING AZURE
    1) Migrated data from an on-premises server to Azure Data Lake using Azure Data Factory and processed CSV, JSON, and XML files using Scala, PySpark, and Spark SQL in Databricks.
    2) Wrote the processed files to Azure SQL and Delta Lake using Databricks and moved them to an Archive container.
    3) Built a dashboard on top of the Delta table and orchestrated the whole process using Azure Data Factory.
  • AZURE DATA ENGINEER (Nov 2018 – Mar 2021) at Syniti
    1) Built a data integration pipeline from SAP S/4HANA and Dynamics 365 source systems using Azure Data Factory.

    2) Created date-hierarchy partitions in Azure Data Lake Storage Gen2 to store the data in multiple layers.

    3) With Azure Synapse (Azure Data Warehouse), created schemas, facts, and dimensions to organize and populate the data into table structures for various source systems using stored procedures, and created a semantic layer for data modelling in Azure Analysis Services to build and deploy multiple interactive analytics dashboards and reports using Power BI.

    4) Developed an internal data science platform that automates various DS processes, reducing the computational complexity and time required to solve specific problem statements.

    5) Developed a solution for a utility firm by implementing optimization techniques in Python to recommend the best routes, minimizing the time taken and distance travelled by vehicles and thereby reducing operational cost.
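
The profile does not state which optimization technique was used for the routing work above; a common baseline for this kind of vehicle-routing problem is a greedy nearest-neighbour heuristic. A sketch with made-up stop names and coordinates:

```python
import math

def nearest_neighbour_route(stops, start):
    """Greedy heuristic: always drive to the closest unvisited stop.

    `stops` maps a stop name to (x, y) coordinates; returns the visit order.
    """
    route, remaining = [start], set(stops) - {start}
    while remaining:
        here = stops[route[-1]]
        nxt = min(remaining, key=lambda s: math.dist(here, stops[s]))
        route.append(nxt)
        remaining.remove(nxt)
    return route

# Made-up depot and stop coordinates, for illustration only.
stops = {"depot": (0, 0), "A": (1, 0), "B": (5, 5), "C": (1, 1)}
route = nearest_neighbour_route(stops, "depot")
print(route)  # visits the nearest unvisited stop at each step
```

Greedy routes are rarely optimal, but they are a cheap starting point that exact or metaheuristic solvers can then improve.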

Education

  • AZ-900 (Azure Fundamentals) (Oct 2021) from Microsoft Certified in Azure
  • Data Fundamentals (Jun 2021 – Mar 2022) from Microsoft Certified
  • AZ-104 Microsoft Certified Azure Administrator (Jun 2020 – Mar 2022) from Microsoft Certified
  • DP-203 (May 2020 – Feb 2022) from Microsoft Certified Data Engineer
  • B.Tech (Apr 2012 – May 2016) from NIT RAIPUR, Raipur

Fee details

    ₹800–2,000/hour (US$9.59–23.96/hour)


Courses offered

  • Azure Data Engineering

    • ₹20,000
    • Duration: 30
    • Delivery mode: Online
    • Group size: 5
    • Instruction language: English
    • Certificate provided: Yes
    1) What is a subscription, resource group, and management group, and how to create ADF? What are Blob Storage and Data Lake Gen2?
    2) What are integration runtimes and linked services?
    3) What are datasets, pipelines, and activities? Copy Data for a single file
    4) Recursive, wildcard, and list-of-files options, mapping, user properties, and other Copy Data activity settings
    5) ForEach loop (copy multiple files)
    6) Copy data to SQL Server (upsert, pre-copy script, auto-create table option)
    7) Lookup activity, and ForEach over multiple files using Lookup
    8) Delta load for a single file
    9) Delta load for multiple files
    10) Git integration with Azure
    11) Azure DevOps integration with Azure
    12) Azure Databricks overview and integration with Azure
    13) Delta tables and the properties of Delta Lake
    14) Azure Active Directory: tenants, subscriptions, users
    15) If Condition, Get Metadata, and Web activities
    16) Triggers: event-based, schedule, and tumbling-window
    17) Azure Synapse SQL: serverless pool, dedicated pool
    18) Data flows: mapping data flow, wrangling data flow
    19) Azure Key Vault and shared integration runtime
    20) JDBC connector and secret scopes in Azure Databricks
    21) Facts, dimensions, star and snowflake schemas, and data modelling
    22) Creating stored procedures for data modelling
    23) Sending mail alerts using Logic Apps
    24) CSV, JSON, XML processing using PySpark, Scala, Python
    25) Delta Lake
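
The delta-load topics in the outline above usually mean watermark-based incremental loading: remember the highest modified timestamp processed so far, and copy only newer rows on the next run. A minimal sketch of that logic in plain Python, with a hypothetical record layout:

```python
# Watermark-based delta load: copy only rows newer than the last run.
# The record layout is hypothetical, for illustration only.
source = [
    {"id": 1, "modified": "2024-01-01", "value": "a"},
    {"id": 2, "modified": "2024-01-05", "value": "b"},
    {"id": 3, "modified": "2024-01-09", "value": "c"},
]

def delta_load(rows, watermark):
    """Return rows newer than `watermark`, plus the advanced watermark."""
    fresh = [r for r in rows if r["modified"] > watermark]
    new_watermark = max((r["modified"] for r in fresh), default=watermark)
    return fresh, new_watermark

# First run copies everything and advances the stored watermark.
batch, wm = delta_load(source, "")
# A second run with the saved watermark copies nothing new.
batch2, wm2 = delta_load(source, wm)
```

In ADF the watermark typically lives in a control table or pipeline variable, and the filter is pushed into the source query rather than applied client-side.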
  • Big Data and Spark Using Azure Databricks

    • ₹20,000
    • Duration: 30
    • Delivery mode: Online
    • Group size: 5
    • Instruction language: English
    • Certificate provided: Yes
    Course Content:
    1) What is Databricks SQL?
    2) What is the Databricks Data Science & Data Engineering Workspace?
    3) What is Databricks Machine Learning?
    4) Fundamentals of the Databricks Lakehouse Platform Accreditation
    5) Databricks Architecture and Services
    6) Create and Manage Interactive Clusters
    7) Notebook Versioning and Databricks Repos
    8) What is Delta Lake, and Managing Delta Tables
    9) Manipulating Tables with Delta Lake
    10) Advanced Delta and Delta Lab
    11) Relational Entities
    12) Databases and Views
    13) ETL with Spark SQL
    14) Optional Data Processing
    15) Multi-Hop Processing
    16) Delta Live Tables
    17) Task Orchestration with Databricks Jobs
    18) Architecture of the Lakehouse
    19) Bronze Ingestion Patterns
    20) Promoting to Silver
    21) Gold Query Layer
    22) Storing Data Securely
    23) Propagating Updates and Deletes
    24) Orchestration and Scheduling
    25) Conclusion
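
Several of the Delta Lake topics above (managing Delta tables, propagating updates and deletes) revolve around MERGE semantics: update rows whose key already exists, insert the rest. The core logic, sketched here on plain Python dicts rather than actual Delta tables:

```python
def merge_upsert(target, updates, key="id"):
    """Upsert `updates` into `target` by key: update matches, insert the rest.

    Mirrors the shape of Delta Lake's MERGE INTO ... WHEN MATCHED THEN
    UPDATE / WHEN NOT MATCHED THEN INSERT, applied to lists of dicts.
    """
    by_key = {row[key]: row for row in target}
    for row in updates:
        by_key[row[key]] = row  # update if the key exists, else insert
    return sorted(by_key.values(), key=lambda r: r[key])

# Toy data, for illustration only.
target = [{"id": 1, "v": "old"}, {"id": 2, "v": "keep"}]
updates = [{"id": 1, "v": "new"}, {"id": 3, "v": "added"}]
merged = merge_upsert(target, updates)
print(merged)
```

On a real Delta table the same step is a single `MERGE INTO target USING updates ON target.id = updates.id ...` statement, executed transactionally.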

3 Reviews
4.7 out of 5

January 17, 2023
Payment verified US$ 25

Azure ADF/ SQL

Knows the subject very well. Professional and reasonable charges. Helped me on my project beyond my expectations.


June 12, 2022
Payment verified US$ 1 (50 Coins)

Azure Data engineer tutor

Abhishek simplifies the most complicated concepts as much as possible for students to understand. His support is extremely warm and encouraging. I would advise anyone looking to learn data engineering on Azure to contact him; you will learn the relevant skills required to land your dream job. Thank you, Abhishek!


June 11, 2022
Payment verified US$ 1.98 (100 Coins)

Hands on experience on ADF

Abhishek has good knowledge of ADF. He gave hands-on experience, explained real-time scenarios, and helped with interview preparation. Overall a good experience.