
Data Engineer - Python, SQL, PySpark Job Opening in WorkFromHome – Now Hiring at Terminal

Data Engineer - Python, SQL, PySpark



Job description

Join to apply for the Data Engineer - Python, SQL, PySpark role at Terminal

About Sourcemap
Sourcemap is a pioneer of supply chain transparency and traceability software that spun out of MIT research started in 2008.

Since then, major traders, manufacturers, and brands have adopted Sourcemap's full-suite solution for assurance on the raw materials-to-finished goods supply chain, including ongoing monitoring for production, quality, sustainability, and risks such as deforestation and forced labor.

About The Role
Company Overview
Sourcemap is the leading provider of supply chain mapping, traceability, and transparency software.

We are the only full-suite supply chain transparency and traceability solution on the market.

Our clients include category-leading global brands, manufacturers and suppliers across the food & agriculture, fashion, beauty, manufacturing and electronics industries.

We turn these clients into best-in-class responsible sourcing organizations.

We seek committed individuals who will join our team to support our award-winning, values-led work and to tackle important supply chain challenges in a dynamic startup environment.

About the Job: Sourcemap is seeking an experienced Data Engineer to join their growing engineering team.

You would be joining an enthusiastic and collaborative team of engineers in a fully remote position.

This role has a strong hands-on component, as you would be building and deploying new features to their platform.

You would also take part in the development and mentorship of junior team members by sharing best practices and prior experiences.
What You’ll Do

  • Development and maintenance of ETL and ELT pipelines (a minimal PySpark sketch follows this list)
  • Data warehouse and data lake development and management
  • API development
  • Assembling large, complex datasets that meet functional and non-functional business requirements
  • Optimizing data delivery
  • Automating manual data processing procedures
  • Working with stakeholders, including data science, design, and product teams, and assisting them with data-related technical issues
  • Applying strong analytical skills to unstructured datasets
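
To make the pipeline work above concrete, here is a minimal PySpark sketch of an extract-transform-load job. The bucket paths, column names, and aggregation are illustrative assumptions for this sketch, not details of Sourcemap's platform.

    # Minimal ETL sketch with hypothetical paths and columns; not Sourcemap's code.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("supplier-shipments-etl").getOrCreate()

    # Extract: read raw shipment records (hypothetical location and schema).
    raw = spark.read.json("s3://example-bucket/raw/shipments/")

    # Transform: normalize types, drop malformed rows, aggregate volume per supplier.
    clean = (
        raw
        .withColumn("shipped_at", F.to_timestamp("shipped_at"))
        .withColumn("volume_kg", F.col("volume_kg").cast("double"))
        .dropna(subset=["supplier_id", "shipped_at", "volume_kg"])
    )
    monthly_volume = (
        clean
        .groupBy("supplier_id", F.date_trunc("month", "shipped_at").alias("month"))
        .agg(F.sum("volume_kg").alias("total_volume_kg"))
    )

    # Load: write partitioned Parquet for downstream warehouse and lake consumers.
    monthly_volume.write.mode("overwrite").partitionBy("month").parquet(
        "s3://example-bucket/curated/monthly_supplier_volume/"
    )

The same shape carries over to ELT: land the raw data first, then run the transform step inside the warehouse (for example as SQL models) instead of in Spark.
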
What You’ll Bring
  • 5+ years writing production-ready data pipelines (Python, SQL, Scala, etc.)
  • 3+ years with relational (SQL) and NoSQL databases
  • 2+ years of experience using geospatial tools, data, and techniques (ESRI, QGIS, GDAL, GeoPandas, handling multiple geometry types, coordinate transformations, etc.); a short GeoPandas sketch follows this list
  • A successful history of manipulating, processing, and extracting value from large, disconnected datasets
  • Architecture experience (microservices, AWS management, etc.)
  • Experience using big data tools such as Databricks, Snowflake, Spark, etc.
  • Project management experience
  • Other experience: Node, JavaScript, Go, Git, debugging
  • Comfortable with data visualization and web mapping concepts
Other Skills & Qualifications:
  • Effective listening, verbal, and written communication skills; demonstrates openness to others' ideas.

  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

  • Ability to adapt quickly and stay agile in a fast-moving environment
  • Enjoys working in a collaborative environment
  • Goal-oriented and a self-starter
  • Ability to multitask, prioritize, and manage time effectively to meet demanding deadlines
  • A natural leader; enjoys mentoring and guiding junior developers
  • Enthusiastic and positive team player
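
Because the role calls out geospatial work (mixed geometry types, coordinate transformations), here is a small GeoPandas sketch of that kind of task. The file name, CRS choices, and buffer distance are illustrative assumptions, not project requirements.

    import geopandas as gpd

    # Farm plots might arrive as a mix of points and polygons in WGS84 (EPSG:4326).
    plots = gpd.read_file("farm_plots.geojson")  # hypothetical input file

    # Reproject to a projected CRS so buffers and areas are in meters;
    # UTM zone 19S (EPSG:32719) is used here purely as an example.
    plots_m = plots.to_crs(epsg=32719)

    # Handle multiple geometry types: turn point records into small circular plots,
    # leave polygon records as they are.
    is_point = plots_m.geometry.geom_type == "Point"
    plots_m.loc[is_point, "geometry"] = plots_m.loc[is_point].geometry.buffer(50)

    # Compute area in hectares, then transform back to WGS84 for web mapping.
    plots_m["area_ha"] = plots_m.geometry.area / 10_000
    result = plots_m.to_crs(epsg=4326)
    print(result[["area_ha"]].head())

The habit the listing is hinting at: reproject to a suitable projected CRS before measuring, because areas and distances computed directly in latitude/longitude degrees are not meaningful.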

Seniority level
  • Mid-Senior level

Employment type
  • Full-time

Job function
  • Information Technology

Industries
  • Software Development



Required Skill / Profession

Databases, Analytics & BI (Bases de Datos, Analítica y BI)




    Unlock Your Data Engineer Potential: Insight & Career Growth Guide


  • Real-time Data Engineer Job Trends in WorkFromHome, Chile

    Expertini's real-time analysis tracks the job market for Data Engineer roles in WorkFromHome, Chile, charting the number of open positions and their trend over time. It currently shows roughly 15,711 Data Engineer jobs in Chile and 13,741 in WorkFromHome. These figures give a sense of the market share and opportunities available to professionals in Data Engineer roles in these regions.

  • Are You Looking for a Data Engineer - Python, SQL, PySpark Job?

    Great news! Terminal is currently hiring a Data Engineer - Python, SQL, PySpark to join their team. Feel free to download the job details.

    Wait no longer! Interested in exploring similar jobs as well? Search now.

  • The Work Culture

    An organization's rules and standards set how people are treated in the workplace and how different situations are handled. The work culture at Terminal adheres to the cultural norms outlined by Expertini.

    The fundamental ethical values are:
    • Independence
    • Loyalty
    • Impartiality
    • Integrity
    • Accountability
    • Respect for human rights
    • Compliance with Chilean laws and regulations
  • What Is the Average Salary Range for Data Engineer Python, SQL, PySpark Positions?

    The average salary range for a Data Engineer - Python, SQL, PySpark position varies, but the pay scale is rated "Standard" in WorkFromHome. Salary levels may vary depending on your industry, experience, and skills. It's essential to research and negotiate effectively. We advise reading the full job specification before applying to understand the salary package.

  • What Are the Key Qualifications for Data Engineer Python, SQL, PySpark?

    Key qualifications for Data Engineer - Python, SQL, PySpark typically include skills in databases, analytics, and BI, along with the qualifications and expertise listed in the job specification. Be sure to check the specific job listing for detailed requirements and qualifications.

  • How Can I Improve My Chances of Getting Hired for Data Engineer Python, SQL, PySpark?

    To improve your chances of getting hired for Data Engineer - Python, SQL, PySpark, consider enhancing your skills. Check your CV/Résumé score with our free tool: the built-in Resume Scoring feature gives you a matching score for each job once your CV/Résumé is uploaded, so you can align your CV/Résumé with the job requirements and fill any skill gaps (an illustrative sketch of this kind of matching appears at the end of this guide).

  • Interview Tips for Data Engineer Python, SQL, PySpark Job Success
    Terminal interview tips for Data Engineer - Python, SQL, PySpark

    Here are some tips to help you prepare for and ace your job interview:

    Before the Interview:
    • Research: Learn about Terminal's mission, values, products, and the specific job requirements, and review their other openings for further context.
    • Practice: Prepare answers to common interview questions and rehearse using the STAR method (Situation, Task, Action, Result) to showcase your skills and experiences.
    • Dress Professionally: Choose attire appropriate for the company culture.
    • Prepare Questions: Show your interest by having thoughtful questions for the interviewer.
    • Plan Your Commute: Allow ample time to arrive on time and avoid feeling rushed.
    During the Interview:
    • Be Punctual: Arrive on time to demonstrate professionalism and respect.
    • Make a Great First Impression: Greet the interviewer with a handshake, smile, and eye contact.
    • Confidence and Enthusiasm: Project a positive attitude and show your genuine interest in the opportunity.
    • Answer Thoughtfully: Listen carefully, take a moment to formulate clear and concise responses. Highlight relevant skills and experiences using the STAR method.
    • Ask Prepared Questions: Demonstrate curiosity and engagement with the role and company.
    • Follow Up: Send a thank-you email to the interviewer within 24 hours.
    Additional Tips:
    • Be Yourself: Let your personality shine through while maintaining professionalism.
    • Be Honest: Don't exaggerate your skills or experience.
    • Be Positive: Focus on your strengths and accomplishments.
    • Body Language: Maintain good posture, avoid fidgeting, and make eye contact.
    • Turn Off Phone: Avoid distractions during the interview.
    Final Thought:

    To prepare for your Data Engineer Python, SQL, PySpark interview at Terminal, research the company, understand the job requirements, and practice common interview questions.

    Highlight your leadership skills, achievements, and strategic thinking abilities. Be prepared to discuss your experience, including your approach to meeting targets as a team player. Additionally, review Terminal's products or services and be prepared to discuss how you can contribute to their success.

    By following these tips, you can increase your chances of making a positive impression and landing the job!

  • How to Set Up Job Alerts for Data Engineer Python, SQL, PySpark Positions

    Setting up job alerts for Data Engineer - Python, SQL, PySpark is easy with Chile Jobs Expertini. Simply visit our job alerts page, enter your preferred job title and location, and choose how often you want to receive notifications. You'll get the latest job openings sent directly to your email for FREE!
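
  • Illustrative Sketch: How a CV/Résumé Matching Score Can Be Computed

    For readers curious what a CV-to-job matching score can look like under the hood, below is a deliberately simple sketch using TF-IDF cosine similarity. It is an illustration only: Expertini's actual scoring algorithm is not public, and the example texts are assumptions made for this sketch.

        # Illustration only; not Expertini's actual Resume Scoring algorithm.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        def match_score(resume_text: str, job_description: str) -> float:
            """Return a 0-100 similarity score between a résumé and a job description."""
            vectorizer = TfidfVectorizer(stop_words="english")
            vectors = vectorizer.fit_transform([resume_text, job_description])
            similarity = cosine_similarity(vectors[0], vectors[1])[0, 0]
            return round(similarity * 100, 1)

        # Hypothetical texts for demonstration only.
        job_ad = "Data Engineer with Python, SQL, PySpark, ETL pipelines, GeoPandas, AWS."
        resume = "Built ETL pipelines in Python and PySpark; strong SQL; AWS and GeoPandas."
        print(match_score(resume, job_ad))  # Higher values indicate closer keyword overlap.

    In practice, aligning your CV/Résumé wording with the skills named in the job specification tends to raise this kind of score, which is the point of checking it before you apply.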