Analytics Engineer

Solvento

Data Science
Posted on Jun 28, 2025

About Solvento

At Solvento, we simplify payments to carriers through technology and financing, ensuring error-free and immediate payments.
We prioritize putting our customers first by creating high-quality, beautiful, and functional products that deliver value to everyone involved.
Our success wouldn’t be possible without our people, which is why our team is our top priority. We are committed to fostering an exceptional work environment where Solvento team members can thrive, grow, achieve their goals, and collaborate while having fun.
Why are we here?

We make financial processes in the transportation industry incredibly simple. Backed by an extraordinary team, we exceed the expectations of our customers, employees, and investors. We focus on developing technology-driven products that address the pain points of Mexico’s freight trucking sector.

Our Goal

To become the leading provider of technology solutions that drive the growth of transportation companies across the Americas.

Our Milestone

We’re proud to share that we have raised $12M USD in our Series A funding round, empowering us to further our mission and scale our impact.

--
Objective of the position:

1. Translate Business Needs into Scalable Data Solutions

  • Work closely with stakeholders to capture use cases and design data models in BigQuery via dbt.
  • Build analytics-friendly datasets that power self-serve dashboards and experimentation.

2. Optimize & Automate Data Workflows

  • Refactor legacy SQL into modular, version-controlled dbt flows.
  • Orchestrate ELT pipelines in dbt Cloud, integrating CI/CD tests.
  • Monitor and optimize compute and storage spend to keep costs in check.
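
As a flavor of the incremental logic this role owns, here is a minimal sketch in plain Python of the watermark pattern a dbt incremental model expresses: only rows newer than the last successful load are processed, keeping scan costs in check. The table and column names (`source`, `loaded_at`) are illustrative, not Solvento's.

```python
from datetime import datetime, timezone

def incremental_filter(rows, watermark):
    """Keep only rows newer than the last successful load (the watermark),
    so each run processes new data instead of rescanning the full table."""
    return [r for r in rows if r["loaded_at"] > watermark]

# Illustrative source data.
source = [
    {"id": 1, "loaded_at": datetime(2025, 6, 1, tzinfo=timezone.utc)},
    {"id": 2, "loaded_at": datetime(2025, 6, 20, tzinfo=timezone.utc)},
    {"id": 3, "loaded_at": datetime(2025, 6, 28, tzinfo=timezone.utc)},
]
watermark = datetime(2025, 6, 15, tzinfo=timezone.utc)

new_rows = incremental_filter(source, watermark)
print([r["id"] for r in new_rows])  # [2, 3]
```

In dbt the same idea lives in an incremental model's `WHERE` clause; in BigQuery it pairs naturally with partitioned tables so the filter prunes partitions rather than scanning them.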

3. Elevate Data Quality & Infrastructure

  • Implement automated tests, lineage, alerting, and SLA tracking to ensure reliability and governance.
  • Plan for scalability and performance tuning in BigQuery.
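
The automated tests mentioned above are the kind dbt generates for `not_null` and `unique` columns; a minimal Python sketch of those two checks, over hypothetical payments rows, looks like this:

```python
def not_null(rows, column):
    """Return rows missing a value in `column` (empty list = check passes)."""
    return [r for r in rows if r.get(column) is None]

def unique(rows, column):
    """Return values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for r in rows:
        value = r[column]
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

# Illustrative rows with one null and one duplicate key.
payments = [
    {"payment_id": "a1", "carrier_id": "c9"},
    {"payment_id": "a2", "carrier_id": None},
    {"payment_id": "a2", "carrier_id": "c7"},
]

print(len(not_null(payments, "carrier_id")))   # 1 violation
print(unique(payments, "payment_id"))          # ['a2']
```

Wired into CI/CD, failures of checks like these block a deploy and feed the alerting and SLA tracking described above.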

4. Build Insightful Data Products

  • Partner with all departments to deliver dashboards, curated metrics, and data products.
  • Maintain a consistent metrics layer and drive adoption of self-serve analytics.
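
The point of a consistent metrics layer is that each metric is defined once and every dashboard computes it from that single definition, rather than re-implementing the formula per report. A toy sketch, with hypothetical metric names:

```python
# Central metric definitions: one formula per metric, shared by all consumers.
METRICS = {
    "take_rate": lambda row: row["revenue"] / row["gmv"],
    "on_time_pct": lambda row: row["on_time"] / row["total"],
}

def compute(metric, row):
    """Evaluate a named metric against a row of aggregated inputs."""
    return METRICS[metric](row)

print(round(compute("take_rate", {"revenue": 30.0, "gmv": 1000.0}), 3))  # 0.03
```

In practice this layer lives in dbt (metrics/semantic models) or in the BI tool, but the principle is the same: dashboards reference the definition, never copy it.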

5. Document & Continuously Improve

  • Own analytics documentation and keep it current.
  • Lead code reviews, debugging sessions, and raise the analytics bar.
  • Apply software-engineering best practices (testing, modularity, version control) to all data work.

What Sets You Up for Success

  • Detail-oriented mindset – testing, quality checks, automation, and clear documentation.
  • Deep-dive learning & rapid application – you love diving into unfamiliar domains, mastering the details fast, and immediately putting new knowledge to work.
  • High-pace stamina – you have the grit to excel in a demanding, fast-moving environment; it’s hard, but the impact (and reward) is huge.
  • Business curiosity & ownership – you want to understand the “why” behind every metric and own data end-to-end, from ingestion to the dashboard in a stakeholder’s hands.
  • Scalability & cost awareness – you design queries, models, and pipelines that perform well today and scale efficiently tomorrow.
  • Collaborative spirit – you enjoy constant interaction with non-technical teams, translating complex concepts into clear, actionable insights.

Necessary Skills & Experience

  • Education & Foundations
    • Bachelor’s degree in Computer Science, Engineering, Mathematics, Economics, or a related field (advanced analytics certificates are a plus).
  • Analytics-Engineering Track Record
    • 5+ years in analytics engineering, data engineering, or BI, delivering production-grade data models, pipelines, and dashboards.
    • Hands-on ownership of end-to-end dbt (or equivalent) projects: models, tests, exposures, CI/CD, and documentation.
  • Core Technical Expertise
    • Cloud warehouse experience, ideally BigQuery; Snowflake, Redshift, Databricks, or similar are also welcome.
    • dbt proficiency (or comparable framework) plus expert-level SQL: performance tuning, incremental logic, cost-aware design.
    • Solid Python for data transformation, automation, and packaging.
    • Familiarity with a modern BI layer—Metabase, Looker, Power BI, Tableau, etc.
  • Data Architecture & Quality
    • Strong grasp of dimensional modeling, partitioning, incremental loading, and performance optimization.
    • Proven implementation of automated testing, lineage, alerting, and SLA monitoring frameworks to guarantee reliability and governance.
  • Software-Engineering Best Practices for Data
    • Version control (Git), code reviews, modular design, observability, and rigorous documentation baked into every deliverable.
  • Stakeholder Collaboration & Communication
    • Fluent English (written & spoken).
  • Execution & Ownership Mindset
    • Demonstrated success hitting deadlines, driving ambiguous projects to completion, and proactively automating manual workflows.