Senior Data Engineer (ETL Developer)
Job Description
Job#: 3009922
Senior ETL Developer – Apex Systems Mexico
Who We Are
Apex Systems is a global technology services firm delivering data, cloud, and digital solutions that enable enterprise transformation. Through our Mexico Delivery Center (MDC), we help clients build modern data pipelines, analytics platforms, and cloud-based ecosystems that power intelligent, data-driven decisions.
Position Overview
We are seeking a highly skilled ETL Developer to join our Data Engineering Practice at the Mexico Delivery Center. This role will focus on designing and developing robust ETL pipelines, data transformations, and analytics solutions to support product intelligence and operational decision-making. The ideal candidate combines strong SQL expertise, data pipeline development, and a proactive, analytical mindset with the ability to work in fast-paced, collaborative environments.
Key Responsibilities
- Design, develop, and maintain ETL workflows to extract, transform, and load data from multiple internal sources.
- Build and optimize data pipelines using technologies such as Hive, Pig, Python, and Spark.
- Work with structured and unstructured data within Hadoop or distributed data environments.
- Develop data integration and transformation logic to onboard, cleanse, and convert raw data into meaningful metrics and insights.
- Write complex SQL queries for data validation, analysis, and production processes.
- Collaborate with business and analytics teams to ensure data pipelines meet defined business requirements and performance SLAs.
- Develop and maintain data quality and validation routines to ensure accuracy and consistency across data sets.
- Contribute to the design of data architectures that support both batch and real-time analytics.
- Perform QA and validation of ETL jobs, ensuring performance, scalability, and adherence to technical standards.
- Provide ad-hoc analysis and insights to address specific business or operational questions.
- Participate in Agile ceremonies and collaborate with cross-functional global teams.
Required Skills & Experience
- 5+ years of experience in ETL development, data engineering, or BI roles.
- Strong proficiency in SQL (comfortable writing complex queries by hand).
- Hands-on experience with Hive, Pig, and Spark for data transformation.
- Experience with Python for scripting, automation, and data processing.
- Solid understanding of data modeling, data cleansing, and ETL best practices.
- Background working with large-scale distributed data platforms (Hadoop ecosystem).
- Strong communication and problem-solving skills; ability to work collaboratively across teams.
- Proactive and “go-getter” mindset with a focus on continuous improvement.
- English proficiency (B2+ level) for collaboration with global stakeholders.
Bonus Qualifications
- Experience with AWS data services (S3, EMR, Glue, Redshift).
- Knowledge of Linux/Unix environments and shell scripting.
- Familiarity with real-time data ingestion frameworks or streaming platforms (Kafka, Flink).
- Experience building analytics and reporting layers on top of large data sets.
- Exposure to data visualization tools (Tableau, Power BI) for metrics validation.
Apex Systems is a world-class IT services company that serves thousands of clients across the globe. When you join Apex, you become part of a team that values innovation, collaboration, and continuous learning. We offer quality career resources, training, certifications, development opportunities, and a comprehensive benefits package. Our commitment to excellence is reflected in many awards, including Great Place to Work® and Great Place for Women to Work® in Mexico.
About Apex Systems, Inc.
Apex Systems is a world-class technology services business that draws on industry insight and experience to deliver solutions that fulfill our clients' digital visions. We provide a continuum of services, from workforce mobilization and modern enterprise solutions to digital innovation, to drive better results and bring more value to our clients. Apex transforms our customers with modern enterprise solutions tailored to the industries we serve. Apex has a presence in over 70 markets across the US, Canada, and Mexico.
Apex is a segment of ASGN Inc. (NYSE: ASGN). To learn more, visit www.apexsystems.com.