BioCatch is the leader in Behavioral Biometrics, a technology that leverages machine learning to analyze an online user’s physical and cognitive digital behavior to protect individuals online. BioCatch’s mission is to unlock the power of behavior and deliver actionable insights to create a digital world where identity, trust, and ease coexist. Today, 32 of the world's largest 100 banks and 210 total financial institutions rely on BioCatch Connect™ to combat fraud, facilitate digital transformation, and grow customer relationships. BioCatch’s Client Innovation Board, an industry-led initiative including American Express, Barclays, Citi Ventures, and National Australia Bank, helps BioCatch identify creative and cutting-edge ways to leverage the unique attributes of behavior for fraud prevention. With over a decade of analyzing data, more than 80 registered patents, and unparalleled experience, BioCatch continues to innovate to solve tomorrow’s problems.
Job Responsibilities:
Data Architecture Direction: Provide strategic direction for our data architecture, selecting the appropriate components for various tasks. Collaborate on requirements and make final decisions on system design and implementation
Project Management: Manage end-to-end execution of high-performance, large-scale data-driven projects, including design, implementation, and ongoing maintenance
Cost Optimization: Monitor and optimize cloud costs associated with data infrastructure and processes
Efficiency and Reliability: Design and build monitoring tools to ensure the efficiency, reliability, and performance of data processes and systems (a minimal sketch of such a check appears after this list)
DevOps Integration: Implement and manage DevOps practices to streamline development and operations, focusing on infrastructure automation, continuous integration/continuous deployment (CI/CD) pipelines, containerization, orchestration, and infrastructure as code. Ensure scalable, reliable, and efficient deployment processes
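As an illustration of the monitoring responsibility above, here is a minimal Python sketch of a data-freshness check against PostgreSQL (one of the data components named under Requirements). The connection string, the events table, its updated_at column, and the one-hour threshold are all hypothetical placeholders, not details from the posting:

# Minimal sketch of a data-freshness monitor (illustrative only).
# Assumes a hypothetical PostgreSQL table `events` with an `updated_at`
# timestamp column; the DSN, table, and threshold are placeholders.
from datetime import datetime, timedelta, timezone

import psycopg2  # pip install psycopg2-binary

FRESHNESS_THRESHOLD = timedelta(hours=1)  # hypothetical alerting threshold

def check_freshness(dsn: str) -> bool:
    """Return True if the newest row in the events table is within the threshold."""
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.execute("SELECT MAX(updated_at) FROM events;")
            (latest,) = cur.fetchone()
    if latest is None:
        print("ALERT: events table is empty")
        return False
    if latest.tzinfo is None:  # treat naive timestamps as UTC
        latest = latest.replace(tzinfo=timezone.utc)
    lag = datetime.now(timezone.utc) - latest
    if lag > FRESHNESS_THRESHOLD:
        print(f"ALERT: events data is stale by {lag}")
        return False
    return True

if __name__ == "__main__":
    check_freshness("postgresql://user:pass@localhost:5432/analytics")

In practice a check like this would run on a schedule and publish to an alerting channel rather than print to stdout.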
Requirements:
5+ Years of Experience: Demonstrated experience as a DevOps professional with a strong focus on big data environments, or as a Data Engineer with strong DevOps skills
Data Components Management: Experience managing and designing data infrastructure, such as Snowflake, PostgreSQL, Kafka, Aerospike, and object storage
DevOps Expertise: Proven experience creating, establishing, and managing big data tools, including automation tasks (see the automation sketch after this list). Extensive knowledge of DevOps concepts and tools, including Docker, Kubernetes, Terraform, ArgoCD, Linux, networking, load balancing, Nginx, etc.
Programming Skills: Proficiency in Python with solid object-oriented programming (OOP) skills, with an emphasis on big data processing such as PySpark (see the sketch after this list). Experience with shell scripting languages such as Bash for automation tasks
Cloud Platforms: Hands-on experience with major cloud providers such as Azure, Google Cloud, or AWS
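As a sketch of the kind of automation referenced in the DevOps Expertise item, the following uses the official Kubernetes Python client to flag pods that are not in a healthy phase. The data-platform namespace is a hypothetical placeholder, and the script assumes a local kubeconfig is already configured:

# Minimal sketch: flag unhealthy pods in a namespace (illustrative only).
# Assumes kubectl access is already configured (~/.kube/config); the
# `data-platform` namespace is a hypothetical placeholder.
from kubernetes import client, config  # pip install kubernetes

def report_unhealthy_pods(namespace: str = "data-platform") -> None:
    config.load_kube_config()  # use the local kubeconfig credentials
    v1 = client.CoreV1Api()
    for pod in v1.list_namespaced_pod(namespace).items:
        phase = pod.status.phase
        if phase not in ("Running", "Succeeded"):
            print(f"UNHEALTHY: {pod.metadata.name} is {phase}")

if __name__ == "__main__":
    report_unhealthy_pods()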
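And for the big data processing emphasis in the Programming Skills item, a minimal PySpark sketch of a DataFrame aggregation; the input path and column names (user_id, event_type) are hypothetical:

# Minimal PySpark sketch: count events per user (illustrative only).
# The input path and column names (user_id, event_type) are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("event-counts").getOrCreate()

events = spark.read.csv("events.csv", header=True, inferSchema=True)

counts = (
    events.groupBy("user_id", "event_type")
          .agg(F.count("*").alias("n_events"))
          .orderBy(F.desc("n_events"))
)

counts.show(10)
spark.stop()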
Nice to have:
Performance Optimization: Experience in optimizing performance for big data tools and pipelines - Big Advantage
Security Expertise: Experience in identifying and addressing security vulnerabilities within the data platform - Big Advantage