We are looking for a candidate with experience designing, developing, and deploying machine learning models and AI-driven solutions, with a strong background in machine learning algorithms, data science, and software engineering, and expertise in cloud-based AI services.
Job Responsibilities:
Develop AI-powered applications for real-world scenarios
Design, develop, and optimize machine learning models, including large language models (LLMs), for automation and predictive analytics
Implement deep learning models using TensorFlow, PyTorch, or other frameworks
Develop and integrate AI-driven solutions into existing software applications
Fine-tune and optimize models for performance and scalability
Deploy models on cloud platforms such as AWS, GCP, Azure, or other providers
Monitor and improve model accuracy and efficiency through continuous iteration
Preprocess, analyze, and visualize large datasets to extract meaningful insights
Collaborate with cross-functional teams, including software engineers and data scientists
Work with data engineers to ensure efficient data pipelines for AI applications
Requirements:
Experience: 3 to 5 Years in AI/ML development and deployment
Qualification: B.E. in Computer Science, or any degree with a specialization in Artificial Intelligence & Data Science
Strong programming skills in Python
Experience with ML frameworks such as TensorFlow, PyTorch, Scikit-Learn, or Keras
Experience with NLP, computer vision, or reinforcement learning
Proficiency in data processing libraries like Pandas and NumPy
Hands-on experience with cloud-based AI/ML services (AWS SageMaker, Google Vertex AI, Azure ML, etc.)
Understanding of MLOps principles, model deployment, and monitoring
Strong problem-solving skills and ability to work in an agile development environment
Excellent communication and teamwork abilities
Nice to have:
Familiarity with big data technologies like Hadoop, Spark, or Kafka
Knowledge of containerization and orchestration tools like Docker and Kubernetes