Architect the Future of Logistics: Data Engineer (Cloud & AI Focus)

You will build the scalable data foundations that power one of Europe's largest logistics networks, directly impacting how global freight moves more efficiently and sustainably.

About Us:
Trimble is an industrial technology company transforming the way the world works by delivering solutions that enable our customers to thrive. We create technologies that connect the digital and physical worlds, helping our customers increase productivity, quality, safety, and sustainability. From purpose-built products to enterprise-level solutions, our technology empowers professionals in construction, geospatial, government, transportation, and more. In the Transportation & Logistics segment, our solutions make it safer, simpler, and more efficient to move freight, bringing together a global network of shippers, carriers, brokers, and 3PLs.

What Makes This Role Unique:
You will be at the forefront of migrating and scaling cloud-based data solutions for a market leader, leveraging AI tools such as Cursor to solve complex technical challenges and influence the data strategy of a global transportation management system provider.
Job Responsibilities:
Collaborate on the design and implementation of high-performance, scalable data solutions, including data lakes and streaming systems that handle massive global logistics traffic.
Architect robust ETL/ELT pipelines and event-driven ingestion processes to ensure seamless data flow across PostgreSQL, Redshift, and DynamoDB environments.
Optimize cloud resources for peak performance and cost-efficiency, ensuring our data infrastructure is lean and powerful.
Collaborate with cross-functional teams to translate business requirements into comprehensive, cloud-based solutions that drive real-world impact.
Support data governance efforts, establishing policies that guarantee the highest standards of data quality, security, and compliance.
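To give a flavor of the event-driven ingestion work described above, here is a minimal sketch in Python of a Lambda-style handler that pulls object keys out of an S3-style notification event so they can be queued for downstream ETL. The function name, event shape, and sample key are hypothetical illustrations, not part of the actual role or codebase:

```python
import json


def handle_s3_event(event: dict) -> list[str]:
    """Hypothetical Lambda-style handler: collect the object keys from a
    simplified S3 notification payload for downstream ETL processing."""
    keys = []
    for record in event.get("Records", []):
        key = record.get("s3", {}).get("object", {}).get("key")
        if key:
            keys.append(key)
    return keys


# Simplified example payload, loosely modeled on an S3 put notification
sample_event = {
    "Records": [
        {"s3": {"object": {"key": "freight/2024/shipments.csv"}}}
    ]
}
print(json.dumps(handle_s3_event(sample_event)))
```

In a real pipeline, a handler like this would typically forward the keys to a queue or trigger a transformation job rather than print them.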
Requirements:
Cloud Data Expertise: At least 2 years of proven experience with data warehouses and data lakes within a cloud ecosystem (e.g., AWS Glue, Lambda, Step Functions).
Technical Scripting: Advanced proficiency in Python and SQL for complex data engineering and automation.
Infrastructure & DevOps: Strong skills in Infrastructure as Code (Terraform or CloudFormation) and CI/CD workflows.
Modern Data Stack: Familiarity with containerization (Docker/Kubernetes) and managing diverse database environments.
Nice to have:
Hands-on experience with data visualization tools like QuickSight.
Proficiency in AI-assisted development tools and a drive to adopt new technologies.