s*****4

BI & Data Engineer
USD 38,000
June 3, 2000

About Candidate

Location

Education

INTERNATIONAL BUSINESS MANAGEMENT 2023 - 2024
Coventry University London Campus
MECHANICAL ENGINEERING 2017 - 2021
Raghu Institute of Technology

Work & Experience

BI & Data Engineer, 02/01/2024 - 03/28/2025
Flash Tech IT

- Designed and implemented automated ETL pipelines in Azure Data Factory (ADF) to extract, transform, and load data from on-premises systems, cloud sources, and APIs into Azure Data Lake Storage (ADLS) and Azure SQL Database.
- Used Databricks and PySpark to process large datasets, performing transformation, cleansing, and aggregation to support business analytics and reporting requirements (see the sketch after this list).
- Implemented Delta Lake for scalable, high-performance storage, leveraging ACID transactions, schema enforcement, and data versioning to keep processing workflows reliable.
- Created interactive Power BI reports with real-time visualizations, enabling stakeholders to track key metrics such as sales performance and operational efficiency.
- Led the implementation of Azure DevOps CI/CD pipelines, automating code deployment for faster, more reliable releases of data pipelines across environments.
- Optimized existing pipelines for performance and cost, reducing cloud resource consumption by 30% without compromising data quality or throughput.
- Integrated Unity Catalog for centralized data governance, covering secure metadata management, data lineage tracking, and access control for users and stakeholders.
- Set up monitoring and alerting through Azure Monitor to detect and resolve pipeline issues proactively, keeping operations smooth and data quality high.
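To illustrate the Databricks/PySpark and Delta Lake work described above, here is a minimal sketch of one such pipeline stage. The storage paths, table grain, and column names (order_id, amount, region) are hypothetical placeholders, not details from the role.

# Minimal PySpark/Delta sketch of a cleanse-and-aggregate stage.
# All paths and column names below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sales_etl").getOrCreate()

# Extract: raw files landed in ADLS by an ADF copy activity (assumed path).
raw = spark.read.option("header", True).csv(
    "abfss://raw@examplelake.dfs.core.windows.net/sales/"
)

# Transform: cleanse and aggregate to the grain the reports consume.
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("amount").isNotNull())
       .withColumn("amount", F.col("amount").cast("double"))
)
daily = clean.groupBy("order_date", "region").agg(
    F.sum("amount").alias("total_sales"),
    F.countDistinct("order_id").alias("order_count"),
)

# Load: writing Delta (the default table format on Databricks) is what
# provides the ACID transactions, schema enforcement, and version history
# mentioned above, unlike plain Parquet.
daily.write.format("delta").mode("overwrite").save(
    "abfss://curated@examplelake.dfs.core.windows.net/sales_daily/"
)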

Data Engineer, 12/01/2022 - 01/30/2023
BlackRock

- Built automated data integration workflows with ADF to synchronize data between Amazon Redshift, S3, and Azure, consolidating it into a central repository in Azure SQL Database and ADLS.
- Orchestrated complex data workflows with Apache Airflow, automating pipeline scheduling, monitoring, and error handling, which increased pipeline reliability and reduced manual intervention (see the DAG sketch after this list).
- Leveraged Databricks and PySpark to turn raw, unstructured data into usable formats for analytics, improving overall data quality and processing speed.
- Integrated Unity Catalog for centralized metadata management, improving data governance and streamlining data access under proper security policies.
- Designed and developed Power BI reports and dashboards, giving leadership actionable insight into business metrics and enabling real-time decision-making.
- Implemented data security protocols in line with regulations, including encryption at rest and controlled data access, to safeguard sensitive information.
- Automated pipeline monitoring and alerting with Azure Monitor to ensure high availability and performance, reducing data downtime by 20%.
- Worked with cross-functional teams on data accessibility and integration strategy, significantly improving the efficiency of reporting and analysis.
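As a sketch of the Airflow orchestration pattern mentioned above (scheduling, retries, dependency ordering), the following minimal DAG is illustrative only; the DAG id, task names, and placeholder callables are assumptions, not the actual pipeline.

# Minimal Airflow 2.x DAG sketch: daily schedule, automatic retries,
# and a two-step dependency chain. Task bodies are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_staging():
    # Placeholder for the Redshift/S3 -> Azure consolidation step.
    print("extracting source data to staging")


def load_to_warehouse():
    # Placeholder for the load into Azure SQL Database / ADLS.
    print("loading staged data into the warehouse")


with DAG(
    dag_id="consolidation_pipeline",
    start_date=datetime(2022, 12, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={
        "retries": 2,                      # automatic error handling
        "retry_delay": timedelta(minutes=5),
    },
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_to_staging)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)
    extract >> load                        # load runs only after extract succeeds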

Data Engineer, 01/03/2022 - 11/30/2022
Progressive Insurance

- Led the migration of legacy SSIS- and SSAS-based ETL processes to ADF, modernizing the pipeline architecture and improving its scalability and flexibility.
- Developed robust ETL pipelines to extract, transform, and load data from SAP BW and SAP BO into Azure SQL Database, centralizing enterprise data for real-time reporting.
- Applied PySpark in Databricks for large-scale transformations and optimizations, cutting data processing time by 40% while improving the quality of the transformed data (see the sketch after this list).
- Designed Power BI dashboards to visualize KPIs and other key metrics, enabling executives to make data-driven decisions quickly.
- Coordinated the migration of on-premises SAP systems to Azure, improving performance, flexibility, and cost efficiency through cloud resources.
- Implemented pipeline monitoring and error handling through Azure Monitor to keep data flows reliable and accurate and to reduce troubleshooting time.
- Collaborated with data scientists and analysts to improve data quality and accuracy for downstream analytics, meeting business requirements more effectively.
- Wrote documentation and trained internal teams to manage and maintain the new cloud-based data infrastructure.
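The PySpark speed-up claimed above is the kind of result typically achieved by pruning rows early and broadcasting small dimension tables to avoid shuffles. The sketch below shows that pattern under assumed table and column names (staging.claims, staging.policy_dim, claim_amount), not the actual project code.

# Hypothetical PySpark optimization sketch: filter before joining,
# broadcast the small lookup table instead of shuffling the large one.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("claims_transform").getOrCreate()

facts = spark.table("staging.claims")        # large extract (e.g. from SAP BW)
dims = spark.table("staging.policy_dim")     # small lookup table

result = (
    facts.filter(F.col("claim_year") >= 2021)    # prune rows before the join
         .join(broadcast(dims), "policy_id")     # avoid a full shuffle join
         .groupBy("policy_type")
         .agg(F.avg("claim_amount").alias("avg_claim"))
)

result.write.mode("overwrite").saveAsTable("curated.claims_by_policy_type")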

Business Intelligence Developer, 05/03/2021 - 12/31/2021
Ford Motor Company

- Built and optimized efficient ETL pipelines in SSIS, enabling smooth data movement between source systems (such as SAP) and the data warehouse for analytics and reporting (an illustrative sketch follows this list).
- Developed custom SSRS reports plus Power BI and Tableau dashboards to visualize key performance indicators (KPIs), helping business users track and analyze performance.
- Automated data transformation workflows with Alteryx, cutting manual data preparation by 50% and improving processing efficiency across teams.
- Integrated SAP data into business intelligence systems, providing stakeholders with real-time reporting and better operational insight.
- Developed and deployed SSAS multidimensional models, improving query performance and interactivity in reporting tools for analysis of large datasets.
- Regularly tuned ETL processes to improve performance, reduce resource consumption, and speed up processing, lowering infrastructure costs.
- Worked closely with business stakeholders to design intuitive visualizations and reporting solutions aligned with business goals and better decision-making.
- Provided ongoing maintenance and support for ETL pipelines, ensuring data is accurately transformed and delivered for business intelligence applications.
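SSIS packages are authored in a visual designer rather than in code, so as a stand-in the sketch below shows the equivalent extract-and-load movement in plain Python with pyodbc. Servers, credentials, and table and column names are all hypothetical.

# Illustrative source-to-staging load, analogous to an SSIS data flow.
# Connection strings and schemas below are placeholders only.
import pyodbc

src = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=src-host;DATABASE=erp;"
    "UID=etl_user;PWD=example"
)
dwh = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=dwh-host;DATABASE=dw;"
    "UID=etl_user;PWD=example"
)

# Extract: pull the incremental slice from the source system.
rows = src.cursor().execute(
    "SELECT order_id, order_date, amount FROM dbo.orders WHERE order_date >= ?",
    "2021-05-01",
).fetchall()

# Load: batch the inserts into the warehouse staging table.
cur = dwh.cursor()
cur.fast_executemany = True              # send inserts in bulk for throughput
cur.executemany(
    "INSERT INTO stg.orders (order_id, order_date, amount) VALUES (?, ?, ?)",
    [tuple(r) for r in rows],
)
dwh.commit()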