We are looking for a Senior DevOps Engineer to drive efficient and secure delivery pipelines, ensure robust infrastructure reliability, and uphold security and compliance standards. This pivotal role requires expertise in automation, cloud technologies, and modern deployment practices to enable our teams to deliver high-quality solutions at scale.
Responsibilities
- Design and implement CI/CD pipelines, optimizing delivery processes and minimizing risks
- Build and maintain scalable infrastructure using infrastructure-as-code practices
- Enhance Identity and Access Management systems to secure data and infrastructure
- Manage cloud-based services (e.g., Databricks Lakehouse, Unity Catalog) and optimize usage across teams
- Oversee and enhance deployment workflows while adhering to security and compliance frameworks
- Actively monitor operational systems to ensure uptime and resolve performance bottlenecks
- Collaborate across teams to establish automation best practices and eliminate manual processes
- Continuously evaluate and implement new tools to improve efficiency in DevOps operations
Requirements
- 3+ years of proven experience in DevOps practices, with a strong understanding of automation tools and deployment pipelines (e.g., Jenkins, GitLab CI/CD)
- Proficiency in Databricks, including the Databricks Lakehouse and Unity Catalog
- Expertise in Infrastructure as Code tooling, including building and maintaining environments with Terraform and Ansible
- Background in managing Identity and Access Management systems, ensuring secure authentication and role management
- Knowledge of cloud platforms like AWS, Microsoft Azure, or Google Cloud Platform, including scalable architecture practices
- Familiarity with security compliance processes in DevOps workflows
- Strong problem-solving skills and the ability to work collaboratively in a fast-paced environment
- Excellent communication skills in English, with a minimum proficiency level of B2
Nice to have
- Familiarity with AWS Secrets Manager and Amazon VPC
- Experience with Apache Airflow for pipeline orchestration
- Understanding of Databricks Asset Bundles for job and environment provisioning
- Exposure to monitoring and incident-management tools such as Datadog and PagerDuty, including handling on-call responsibilities
We offer
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn