Our client is a multinational IT consulting company with a presence in more than 45 countries across five continents.
They are currently seeking a Software Engineer to join their team.
Mission:
As a Software Engineer in the Data Analytics team, you'll be responsible for building and deploying scalable GenAI solutions.
You'll collaborate closely with data scientists and contribute to the development of production-ready systems that leverage cutting-edge technologies.
Main responsibilities/functions:
- Develop and deploy GenAI solutions using Python and object-oriented programming (OOP) principles.
- Implement and manage CI/CD pipelines using GitHub Actions.
- Deploy solutions on AWS using CloudFormation.
- Work with frameworks such as LangChain, LangGraph, Semantic Kernel, and Crew.ai.
- Apply knowledge of Large Language Models (LLMs) and Prompt Engineering.
- Build and deploy services using FastAPI on AWS.
- Collaborate with data scientists to refactor analytics solutions for production, ensuring scalability and efficiency.
- Support the development of ML/AI pipelines using Apache Airflow.
Key Competencies:
- Software Development: Ability to build software products aligned with business and technical requirements.
- Software Development Life Cycle (SDLC): Understanding of structured methodologies for delivering and managing software solutions.
- Software Architecture & Design: Ability to translate requirements into scalable and maintainable software designs.
- Technical Knowledge: Familiarity with the technical components and integration of software products.
- Software Testing: Experience designing and executing testing strategies to ensure product quality and reliability.
Requirements:
- Relevant corporate experience working with GenAI frameworks (such as LangChain, LangGraph, Semantic Kernel, and Crew.ai) and their deployment.
- Experience building chatbots (not just deploying someone else's work).
- Python or Go is a must-have, along with a solid grasp of OOP principles.
- SQL is a must-have.
- Corporate experience implementing and managing CI/CD pipelines using GitHub Actions.
- Corporate experience developing and deploying services using FastAPI.
- Experience assisting in the development of machine learning and AI pipelines using Apache Airflow.
- Strong software development experience and solid understanding of OOP principles.
- Proficiency in Python and/or Go for application development.
- Experience with libraries such as FastAPI, LangGraph, and LangChain.
- Hands-on experience with Git in medium to large development teams.
- Excellent communication skills and experience working in cross-functional teams.
- Experience deploying software using GitHub Actions or similar CI/CD tools.
- Familiarity with AWS components such as SageMaker Studio, Lambda, API Gateway, and RDS.
- Strong technical writing skills for documentation and collaboration.
- Advanced verbal and written English communication skills.
If your profile matches the position, please click SOLICITAR EL PUESTO (Apply for the Position), or click REMITIR A ALGUIEN (Refer Someone) to introduce someone you think is a good fit for the position.