Job Description
Toast is recruiting on behalf of this company for a Senior Data Engineer based in Scottsdale, Arizona. This hybrid role offers the opportunity to join a mission-driven team building technology to transform workforce travel for the better. The company is a growing digital platform reimagining crew travel management through data-driven, scalable solutions.
As a Senior Data Engineer, you’ll play a central role in shaping a modern, cloud-native data ecosystem that supports innovation, real-time decision-making, and self-serve access across internal and external teams. The ideal candidate is passionate about clean architecture, performance optimization, and building data as a product.
This is an excellent opportunity for a collaborative engineer who values impact, scalability, and mentorship in a supportive and inclusive team environment. The role is hybrid, with four days in office and one day remote, based in Scottsdale, Arizona.
Responsibilities
Data Architecture & Engineering
- Design and develop a scalable, microservices-aligned data platform in a cloud-native environment
- Build both real-time and batch data pipelines to power analytics and decision support tools
- Implement diverse storage strategies using SQL, NoSQL, and hybrid models
Data as a Product
- Develop secure, well-documented data APIs to enable self-service access
- Partner with product and business teams to design high-value data products
- Ensure data lineage, quality, and governance across pipelines and storage systems
Microservices Integration & Performance
- Build event-driven systems using tools like Kafka, Azure Event Hub, or Service Bus
- Design efficient ETL/ELT processes for scalable data ingestion and transformation
- Optimize data infrastructure for performance, including indexing and caching
Governance, Security & Compliance
- Apply privacy, security, and compliance standards across all data initiatives
- Implement monitoring and observability to maintain robust data operations
- Integrate with DevSecOps teams to embed security into development workflows
Requirements
Must-Haves
- 5+ years in data engineering, ideally with experience in Data Mesh or data product strategies
- Proficiency in building and managing data pipelines across various architectures
- Expertise in modern storage and processing: SQL, NoSQL (e.g., Cosmos DB, PostgreSQL), Azure Data Lake, Delta Lake, Apache Iceberg
- Strong knowledge of data pipeline and processing tools such as Kafka, Airflow, Flink, Spark, Azure Data Factory, and Databricks
- Experience with event-driven architectures (e.g., Azure Service Bus) and containerized environments (e.g., AWS ECS)
- Familiarity with modern data platforms such as Microsoft Fabric, Databricks, Snowflake, and Apache Hudi
- Advanced API development using GraphQL, REST, or gRPC
- Strong programming skills in Go, Java, and/or Python
- Understanding of data governance tools (e.g., Microsoft Purview, Apache Ranger, OpenLineage)
- Experience with Infrastructure as Code using Bicep, Terraform, or CloudFormation
Nice to Have
- Exposure to ML and AI workflows using Azure Machine Learning or Databricks ML
- Familiarity with Jupyter, Synapse, or Databricks notebooks
- Knowledge of real-time streaming architectures
- Experience with federated data governance or self-serve data platforms
- Understanding of data monetization or analytics strategies
Benefits
- Competitive base salary
- Share Appreciation Rights program for salaried employees
- Paid vacation and sick leave
- 401(k) matching program
- Comprehensive health benefits with cost-sharing
- Hotel and travel discounts
- Charitable donation matching program
- Career growth support and development opportunities
- A values-driven, inclusive team culture committed to innovation and impact