Sybrant Technologies has been at the forefront of transforming its customers into fully digital businesses. Though we are small, we are growing rapidly thanks to our capabilities in contemporary technologies. Sybrant can dive deep into areas such as Mobility, IoT, and Analytics in addition to traditional technologies, and we can implement these solutions rapidly because of the products, frameworks, and partnerships we have built. In addition, our technically sound people and proven processes help accelerate our customers’ adoption curves. Sybrant’s advantage has always been its nimbleness and its delivery of high-quality yet cost-effective solutions. That’s why we are the “Digital Transformation Power” behind our customers. We are a PreludeSys Group Company.
Job Responsibilities:
Develop and maintain web scraping scripts using Python (Requests, BeautifulSoup, Selenium, Scrapy)
Automate extraction workflows to ensure reliable and repeatable data collection
Handle anti-scraping mechanisms such as CAPTCHAs, rotating proxies, headers, and session management
Clean, transform, and load extracted data into internal databases
Design and build REST APIs to expose processed data from the database
Optimize scraping workflows for performance, reliability, and error handling
Monitor scraping jobs, troubleshoot failures, and ensure data freshness
Maintain documentation for scraping logic, API endpoints, and workflows
Collaborate with product and data teams to understand evolving data requirements
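A minimal sketch of the extract-clean-load duties above, assuming a hypothetical product-listing page (the HTML, selectors, and table name are illustrative; in practice the markup would come from Requests or a Selenium-rendered page, and the target would be MySQL/PostgreSQL rather than SQLite):

```python
import sqlite3

from bs4 import BeautifulSoup

# Illustrative HTML standing in for a fetched page
# (in practice: requests.get(url).text).
HTML = """
<div class="listing">
  <div class="item"><span class="name">Widget A</span><span class="price"> $19.99 </span></div>
  <div class="item"><span class="name">Widget B</span><span class="price"> $5.50 </span></div>
</div>
"""

def extract_items(html: str) -> list[tuple[str, float]]:
    """Parse the page with CSS selectors and clean the raw values."""
    soup = BeautifulSoup(html, "html.parser")
    rows = []
    for item in soup.select("div.item"):
        name = item.select_one(".name").get_text(strip=True)
        # Clean/transform: strip whitespace and the currency symbol.
        price = float(item.select_one(".price").get_text(strip=True).lstrip("$"))
        rows.append((name, price))
    return rows

def load_items(conn: sqlite3.Connection, rows: list[tuple[str, float]]) -> None:
    """Load the cleaned rows into an internal database table."""
    conn.execute("CREATE TABLE IF NOT EXISTS products (name TEXT, price REAL)")
    conn.executemany("INSERT INTO products VALUES (?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load_items(conn, extract_items(HTML))
    print(conn.execute("SELECT COUNT(*) FROM products").fetchone()[0])
```

A production version would wrap the fetch in a `requests.Session` with rotating proxies and retry/error handling, which this sketch omits.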
Requirements:
Strong proficiency in Python
Hands-on experience with web scraping tools (Requests, BeautifulSoup, Selenium, Scrapy)
Good understanding of HTML, DOM structure, XPath, and CSS selectors
Experience building REST APIs using FastAPI, Flask, or Django
Solid knowledge of SQL and relational databases (MySQL / PostgreSQL)
Experience handling proxies, cookies, headers, rate limits, and sessions
Familiarity with Git and basic CI/CD workflows
Nice to have:
Understanding of ETL concepts and data engineering workflows
Exposure to Airflow or other workflow orchestrators
Basic understanding of data pipelines or ML pipelines