We are looking for a skilled Data Quality Engineer to join our team. You will work on data ingestion processes and help ensure that high-quality data can be used effectively across the company.
In this role, you will contribute to an innovative supply-chain data analytics platform that leverages artificial intelligence to transform supply-chain insights and deliver domain-specific recommendations.
Responsibilities
- Define, implement, and maintain tooling and processes for data quality management from scratch
- Build and enforce data quality validations within ETL pipelines
- Set up monitoring dashboards and alerts to ensure continuous data quality tracking
- Collaborate with cross‑functional teams to ingest, profile, and standardize data from multiple external sources
- Leverage SQL and Python to analyze and enhance data quality metrics
- Use AWS services for data profiling and validation tasks, such as reading data from S3
- Partner with clients to identify and resolve data quality issues, ensuring effective communication and delivery of solutions
- Work effectively amidst ambiguity and in a fast‑paced startup environment to execute tasks independently
- Foster transparency and proactivity within the team and with stakeholders
Requirements
- 2+ years of experience as a Data Quality Engineer or in a similar role, with a proven ability to establish processes and tools
- Demonstrated experience implementing data quality checks in ETL pipelines and building interactive dashboards and alerts
- Proficiency in SQL and Python for data quality tasks
- Familiarity with AWS services, with a focus on data profiling and reading data from S3
- Background in client‑facing roles, demonstrating excellent communication skills and proactivity
- Capability to independently execute tasks and maintain productivity without constant direction
- Flexibility to adapt to a fast‑paced startup environment with transparent and collaborative workflows
- English level B1+ for effective communication
Nice to have
- Hands‑on experience working with Databricks and PySpark
- Familiarity with data quality monitoring frameworks or tools such as Great Expectations, Soda, or Lightup
We offer
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award‑winning culture recognized by Glassdoor, Newsweek and LinkedIn