Data Engineer (Python, SQL, ETL, Snowflake)
Job Description
Job#: 3011751
Data Engineer (Python, SQL, ETL, Snowflake) – Apex Systems Mexico
Who We Are
Apex Systems is a global technology services firm delivering data, cloud, and digital solutions that enable enterprise transformation. Through our Mexico Delivery Center (MDC), we partner with enterprise clients to design, build, and operate scalable data platforms that power intelligent decision-making and customer analytics.
Position Overview
We are seeking a Data Engineer to join our Customer Intelligence team supporting a leading telecommunications client. This role focuses on building and optimizing ETL pipelines, developing stored procedures, and supporting data integration and transformation within a Snowflake-based data ecosystem.
The ideal candidate has strong experience with Python and SQL, hands-on knowledge of ETL frameworks, and familiarity with version control and data orchestration tools. Candidates with foundational Snowflake skills are welcome; the project provides time for upskilling and earning Snowflake badges before onboarding.
Key Responsibilities
- Design, develop, and maintain data pipelines and ETL processes to support customer analytics and intelligence initiatives.
- Write and optimize SQL queries, stored procedures, and data transformations within Snowflake and related environments.
- Develop Python-based automation scripts for data ingestion, validation, and processing.
- Collaborate with business and technical teams to understand data requirements and translate them into scalable solutions.
- Support job orchestration and workflow automation (Airflow or equivalent scheduling tools).
- Ensure code quality through proper use of version control (Git, GitLab) and documentation.
- Monitor, troubleshoot, and optimize ETL workflows for performance and reliability.
- Participate in Agile ceremonies, code reviews, and cross-team collaboration within the Customer Intelligence group.
Required Qualifications
- 3–6 years of experience as a Data Engineer or ETL Developer in enterprise data environments.
- Strong proficiency in Python for data processing, scripting, and automation.
- Advanced SQL skills, including query tuning and stored procedure development.
- Hands-on experience with ETL frameworks and data transformation pipelines.
- Familiarity with Snowflake (basic to intermediate; upskilling supported).
- Experience with version control systems (Git, GitLab, or similar).
- Understanding of job orchestration tools (Airflow or equivalent).
- Strong analytical, troubleshooting, and communication skills.
- English proficiency (B2+ level) for collaboration with international teams.
Preferred Qualifications
- Experience using Airflow for workflow scheduling and orchestration.
- Familiarity with Flyway or other database migration/versioning tools.
- Background in telecommunications, customer analytics, or data intelligence projects.
- Experience working with data warehouses and cloud data platforms.
Soft Skills
- Proactive and accountable; able to work independently with minimal supervision.
- Strong problem-solving and analytical mindset.
- Clear communicator, collaborative, and results-oriented.
- Adaptable to changing priorities in a fast-moving environment.
Apex Systems is a world-class IT services company that serves thousands of clients across the globe. When you join Apex, you become part of a team that values innovation, collaboration, and continuous learning. We offer quality career resources, training, certifications, development opportunities, and a comprehensive benefits package. Our commitment to excellence is reflected in many awards, including Great Place to Work® and Great Place for Women to Work® in Mexico.
About Apex Systems, Inc.
Apex Systems is a world-class technology services business that incorporates industry insights and experience to deliver solutions that fulfill our clients’ digital visions. We provide a continuum of services, from workforce mobilization and modern enterprise solutions to digital innovation, to drive better results and bring more value to our clients. Apex transforms our customers with modern enterprise solutions tailored to the industries we serve. Apex has a presence in over 70 markets across the US, Canada, and Mexico.
Apex is a segment of ASGN Inc. (NYSE: ASGN). To learn more, visit www.apexsystems.com.