Sr Data Engineer

Yalochat

Data Science
Posted on Jul 29, 2025

Hi! We’re Yalo! We’re on a mission to revolutionize how businesses sell in an omnichannel way with our intelligent sales platform and intelligent agents powered by cutting-edge AI. Imagine a world where businesses seamlessly connect with their customers across every channel—offering personalized experiences, anticipating needs, and delivering what they want with ease. That’s the reality we’re building at Yalo.


Born in Latin America and driven by its spirit of innovation, we’re transforming sales for businesses around the globe. From empowering businesses in emerging markets to helping enterprises scale intelligently, we’re redefining how companies engage with their customers and drive growth.


At Yalo, we believe the future of sales is personalized, omnichannel, intelligent, and conversational. Join us as we empower businesses to build stronger relationships and achieve remarkable results worldwide!

Job Summary 🧾

We’re looking for a Senior Data Engineer to help us scale Yalo’s Conversational AI platform. This role is ideal for someone who thrives at the intersection of data engineering, platform reliability, AI readiness, and data governance.

You’ll design and maintain high-quality, production-grade data pipelines, and you’ll also take ownership of our semantic layer, privacy-by-design practices, and DataOps workflows. You’ll collaborate closely with analytics engineers, product managers, and data scientists to make sure that our data foundation supports real-time decisions and future AI models alike.

Your mission?

Design, build, and maintain a world-class data platform that seamlessly powers analytics and AI use cases across Yalo. Ensure that our data is secure, compliant, reliable, and AI-ready — from ingestion to semantic modeling — to enable insights, automation, and scalable decision-making.

What are the responsibilities for this role? 🧠

  • Build and maintain scalable batch and real-time pipelines (dbt Cloud, BigQuery, Kafka, Airflow).

  • Implement and manage CI/CD and testing for data workflows across environments.

  • Own and evolve the semantic data layer, ensuring it aligns with entity modeling and business needs.

  • Monitor, debug, and optimize pipeline performance with observability and alerting in place.

  • Ensure data privacy, access control, and regulatory compliance (e.g., GDPR, LGPD).

  • Automate ingestion, transformation, and ELT workflows across structured/unstructured sources.

  • Collaborate with Analytics Engineers to deliver data products used in dashboards and experimentation.

  • Partner with Data Scientists to prepare and deploy features and datasets for ML/AI use cases.

  • Contribute to the overall data architecture strategy, including governance, security, and reliability.
  • Stay up to date with the latest trends in DataOps, GenAI infrastructure, and platform engineering.

Job Requirements 💻

  • 4+ years of experience in Data Engineering (cloud-first, preferably in fast-paced or AI-oriented environments).

  • Strong SQL and Python skills for transformations, automation, and debugging.

  • Deep experience with dbt Cloud and orchestration tools such as Airflow/Composer.

  • Experience with GCP (BigQuery, Pub/Sub, IAM) and event-driven architectures.

  • Solid understanding of data privacy, PII management, and access/security frameworks.

  • Experience implementing CI/CD workflows for data pipelines and maintaining test coverage.

  • Familiarity with Kafka or similar event streaming tools.

  • Bonus: experience with Looker, Databricks, Confluent, or feature stores.
  • Understanding of software engineering principles (code modularity, documentation, review processes).

Soft Skills that matter to us 🫀

  • AI Mindset – Curious and proactive about using AI tools (e.g., Copilot, LLMs) to augment development, automate tasks, and improve systems. Thinks in terms of how data systems enable machine learning, automation, and intelligence.
  • Ownership Mentality – Proactively takes responsibility for pipeline reliability, data quality, and long-term maintainability.
  • Communication – Ability to explain technical concepts to non-technical stakeholders and write clear documentation.

  • Emotional Intelligence – Demonstrates empathy and self-awareness, contributing to a healthy engineering culture.

Metrics to measure 📈

  • Cost Efficiency of Data Platform – Cloud spend monitoring and optimization; trend in cost-per-pipeline or cost-per-query over time.
  • Data Quality Coverage – % of tables with automated tests, documentation, and monitoring enabled.
  • Model/Asset Reusability – # of downstream users/teams leveraging built models or datasets.
  • AI/ML Enablement – # of successful data products or features used by ML/AI initiatives.

What do we offer? 🥰
  • Unlimited PTO policy
  • Competitive compensation within market range
  • Work-life integration
  • Start-up environment
  • International teamwork
  • Your career here is limited only by you

We care,
We keep it simple,
We make it happen,
We strive for excellence.

At Yalo, we are dedicated to creating a workplace that embodies our core values: caring, initiative, excellence, and simplicity. We believe in the power of diversity and inclusivity, where everyone's unique perspectives, experiences, and talents contribute to our collective success. As we embrace and respect our differences, we strive to create something extraordinary for the benefit of all.
We are proud to be an Equal Opportunity Employer, providing equal opportunities to individuals regardless of race, color, religion, national or ethnic origin, gender, sexual orientation, gender identity or expression, age, disability, protected veteran status, or any other legally protected characteristic. Our commitment to fairness and equality is a fundamental pillar of our company.


At Yalo, we uphold a culture of excellence. We constantly challenge ourselves to go above and beyond, delivering remarkable results and driving innovation. We encourage each team member to take initiative and make things happen, empowering them to bring their best ideas forward and contribute to our shared goals.