At Nisum, we are Success Builders: more than 2,000 professionals building success through technology across seven countries. We specialize in digital consulting, AI-driven development, and tailored solutions for global industry leaders. We accompany each client with a consultative approach that goes beyond code: we understand contexts, anticipate challenges, and create real impact.

At Nisum, you'll be part of an inclusive and multicultural culture where we invest in your technical growth through ongoing certifications, value your expertise in tech talks, and care for your well-being with comprehensive benefits that truly make a difference.

Join us and unleash your talent while transforming organizations and communities.
We are seeking an experienced and driven Snowflake Data Platform Engineer to join our Data Analytics Platform Engineering team.
This team is at the heart of our enterprise data strategy — responsible for enabling scalable infrastructure, standardized frameworks, and resilient pipelines that power analytics and data-driven decision-making across the organization.
The ideal candidate will have a strong background in data platform infrastructure, cloud engineering, and CI/CD practices, with the ability to troubleshoot complex issues, gather business and technical requirements, and propose effective solutions while collaborating with stakeholders across various teams.
Key Responsibilities
- Maintain and enhance Snowflake infrastructure: account configuration, role-based access controls, usage monitoring, and platform optimization.
- Develop, scale, and maintain ETL/ELT frameworks to support data ingestion and transformation processes from diverse internal and external sources.
- Manage and evolve the Data Lake architecture on AWS S3, ensuring security, organization, and access standards are enforced.
- Act as the gatekeeper for platform-level permissions and entitlements, ensuring consistent implementation of access policies across Snowflake, S3, and other integrated services.
- Design, implement, and maintain ingestion processes from:
  - File shares and SFTP
  - Cloud storage (S3)
  - Relational databases (SQL Server, PostgreSQL)
  - NoSQL/document stores (MongoDB, DocumentDB)
- Support and integrate with custom ingestion and transformation applications developed in Python, hosted on EKS and EC2.
- Design, manage, and troubleshoot CI/CD pipelines using CircleCI, Octopus Deploy, and other tools for infrastructure-as-code and application delivery.
- Use Git and GitHub to manage codebases, implement branching strategies, and enforce collaboration through peer reviews and version control best practices.
- Collaborate with Data Engineering, DevOps, Security, and Application teams to design creative, scalable, and AWS-native solutions.
- Gather requirements, articulate solution approaches, and participate in technical discussions to align solutions with business needs and platform standards.
- Proactively identify opportunities for optimization, automation, and documentation to improve platform reliability and usability.
- Provide 24-hour on-call support for one week (7 days) each month.
Required Skills & Qualifications
- 5+ years of experience in Data Engineering or Platform Engineering roles.
- Proven expertise with Snowflake: infrastructure design, security model, resource management, and performance optimization.
- Strong proficiency in AWS services, particularly S3, IAM, EC2, EKS, and general networking/security concepts.
- Hands-on experience with CI/CD tools such as CircleCI and Octopus Deploy; familiarity with GitHub Actions is a plus.
- Proficient in Python development within data-driven environments.
- Solid understanding of data ingestion patterns across structured, semi-structured, and unstructured data.
- Familiarity with orchestrating workloads in Kubernetes/EKS.
- Excellent troubleshooting and debugging skills across infrastructure and data pipelines.
- Strong communication skills: capable of gathering requirements, proposing solutions, and collaborating effectively with cross-functional teams.
- Self-motivated, proactive, and willing to go beyond assigned tasks to improve systems and processes.
- Exposure to Terraform or other infrastructure-as-code tools.
- Experience implementing platform observability and monitoring.
- Familiarity with data governance, metadata management, and platform security best practices.
Location & Time Zone
- Available to work in EST, CST, or PST time zones.
Education
- Bachelor's or Master's Degree in Computer Science or related field, or equivalent combination of education and work experience.
What can we offer you?
- Belong to an international and multicultural company that supports diversity.
- Be part of international projects with a presence in North America, Pakistan, India, and Latin America.
- Work environment with extensive experience in remote and distributed work, using agile methodologies.
- Culture of constant learning and development in current technologies.
- Pleasant and collaborative environment, with a focus on teamwork.
- Access to learning platforms, certifications (Google Cloud, Databricks), Tech Talks, and more.
- Ongoing participation in internal and external initiatives covering innovation, hackathons, technology, agility, talks, webinars, well-being, and culture, with the opportunity not only to participate but also to present.
Nisum is an Equal Opportunity Employer and we are proud of our ongoing efforts to foster diversity and inclusion in the workplace.