Alexey Popitich


Summary

Results-oriented Lead Data Engineer with 3+ years of experience spearheading data infrastructure initiatives. Proven ability to design, build, and deploy high-performance data warehouse (DWH) architectures, data pipelines, and big data processing solutions, and to manage and drive such projects independently. Strong understanding of data modeling, ETL processes, and cloud technologies. Eager to apply this experience to data-driven decision making and to unlock business value for your organization.

Overview

3 years of professional experience
10 years of post-secondary education

Work History

Python Developer

Freelance
11.2022 - Current
  • Engineered robust websites using Django framework, ensuring high performance and maintainability.
  • Developed scalable web scrapers to extract and process large datasets, optimizing data retrieval processes.
  • Implemented e-commerce solutions with integrated cryptocurrency payments, enhancing transaction security and user convenience.
  • Crafted efficient algorithms for web scraping tasks, significantly improving data processing speed and accuracy.
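The scraping work above can be sketched with Python's standard library alone; the page markup and field names below are illustrative assumptions, not taken from the actual client projects.

```python
from html.parser import HTMLParser

# Illustrative listing markup; the structure is an assumption, not a real site.
SAMPLE_HTML = """
<ul>
  <li class="product">Widget A</li>
  <li class="product">Widget B</li>
</ul>
"""

class ProductParser(HTMLParser):
    """Collects the text inside <li class="product"> elements."""

    def __init__(self):
        super().__init__()
        self.in_product = False
        self.products = []

    def handle_starttag(self, tag, attrs):
        if tag == "li" and dict(attrs).get("class") == "product":
            self.in_product = True

    def handle_data(self, data):
        if self.in_product and data.strip():
            self.products.append(data.strip())

    def handle_endtag(self, tag):
        if tag == "li":
            self.in_product = False

parser = ProductParser()
parser.feed(SAMPLE_HTML)
print(parser.products)  # → ['Widget A', 'Widget B']
```

In production scrapers a dedicated parser (e.g. lxml or BeautifulSoup) is more common; the stdlib version keeps the sketch dependency-free.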

Lead Data Engineer

Karpov.Courses
05.2023 - 08.2024
  • Reduced latency in real-time analytics applications by optimizing query performance through indexing strategies and proper database design principles.
  • Streamlined data ingestion processes to accommodate increasing volumes of incoming information while maintaining data integrity and ensuring timely accessibility across the organization.
  • Delivered cost-effective cloud-based storage solutions for handling large volumes of data, reducing operational expenses and increasing scalability.
  • Developed scalable infrastructure capable of handling large volumes of structured and unstructured data, improving overall system performance.
  • Mentored junior engineers on best practices in data engineering, fostering a culture of continuous learning and improvement within the team.
  • Prepared written summaries to accompany results and maintain documentation.
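The indexing point above can be illustrated with a minimal sqlite3 sketch (the table and column names are hypothetical): once the filter column is indexed, the query planner switches from a full table scan to an index search.

```python
import sqlite3

# Hypothetical events table; names are illustrative, not from the actual project.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, ts TEXT, payload TEXT)")

query = "SELECT * FROM events WHERE user_id = ?"

# Without an index: the plan is a full table scan.
plan = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()
print(plan[0][3])  # e.g. 'SCAN events'

# With an index on the filter column: an index search instead.
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
plan = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()
print(plan[0][3])  # e.g. 'SEARCH events USING INDEX idx_events_user (user_id=?)'
```

The exact plan wording varies between SQLite versions, but the SCAN-to-SEARCH shift is the latency win the bullet describes.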

Data Engineer

Syberry
12.2021 - 10.2023
  • Designed scalable ETL pipelines for improved data ingestion, processing, and storage.
  • Collaborated on ETL (Extract, Transform, Load) tasks, maintaining data integrity and verifying pipeline stability.
  • Contributed to internal activities for overall process improvements, efficiencies, and innovation.
  • Prepared written summaries to accompany results and maintain documentation.
  • Onboarded new team members to the project and trained them in data engineering best practices, fostering corporate culture.
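An ETL pipeline of the kind described above can be sketched in a few lines of standard-library Python; the source format and schema here are assumptions for illustration, not the actual project's.

```python
import csv
import io
import sqlite3

# Hypothetical raw feed; in the real pipelines the sources were external systems.
RAW_CSV = "order_id,amount\n1,10.50\n2,\n3,7.25\n"

def extract(raw: str) -> list[dict]:
    """Parse the raw CSV feed into row dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Drop rows with missing amounts and cast types (an integrity check)."""
    return [(int(r["order_id"]), float(r["amount"])) for r in rows if r["amount"]]

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Idempotently load the cleaned rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER PRIMARY KEY, amount REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())  # → (2, 17.75)
```

The INSERT OR REPLACE keyed on the primary key is what makes reruns safe, the property behind "verifying pipeline stability" above.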

Big Data Engineer (Trainee)

ISsoft
Minsk
06.2021 - 09.2021
  • Designed and implemented a scalable movie-search application in several variants (pure Python, SQL + Python, Hadoop, and PySpark) for comparison.
  • Utilized Google Cloud Platform (GCP) for infrastructure management, setting up virtual machines and deploying application clusters for benchmarking purposes.
  • Gained experience with big data processing frameworks such as Hadoop and PySpark, laying the foundation for future data engineering work.
  • Contributed to performance analysis by evaluating the application's efficiency across different configurations.

Education

Master of Science

BNTU
Minsk
09.2018 - 07.2024

Master of Engineering

PGATC
Minsk
09.2014 - 05.2018

Skills

Python Development

Accomplishments

Designed and Developed a High-Performance Data Warehouse:
  • Architected and implemented a data warehouse leveraging S3 for scalable storage and ClickHouse for efficient querying.
  • Reduced data processing time by 1.7%.

Engineered Robust Data Pipelines:
  • Built data pipelines using Airflow for orchestration and S3 for data movement, ensuring reliable and efficient data flow into the warehouse.

Established a Maintainable Codebase:
  • Developed a well-structured, documented codebase for the data warehouse and pipelines, facilitating future maintenance and collaboration.

Documented the New Infrastructure:
  • Created comprehensive documentation for the new data infrastructure, ensuring smooth onboarding and knowledge transfer for future team members.
