AI Data Engineer


October 23

Apply Now

Pakistan Single Window (PSW)

B2B • Compliance • Government

Pakistan Single Window (PSW) is an integrated digital platform designed to streamline cross-border trade in Pakistan. It enables parties involved in trade to submit standardized information and documents through a single entry point to meet all import, export, and transit-related regulatory requirements. PSW aims to reduce the time and cost of doing business by digitizing Pakistan's cross-border trade processes and eliminating paper-based manual procedures. It integrates commercial banks, customs, and other government agencies on a unified digital platform to facilitate financial transactions, regulatory compliance, and the electronic processing of licenses, permits, and certificates. Its Tradeverse portal offers a single access point for trade information on imports, exports, and transit, enhancing the efficiency and transparency of trade operations in Pakistan.

51-200 employees

Founded 2021


📋 Description

• Build and optimize ETL pipelines for large-scale AI applications, integrating APIs, web scraping, and real-time data processing.

• Develop and maintain scalable AI data infrastructure using PySpark, Pandas, SQL, and cloud services (Azure, AWS, GCP).

• Implement retrieval-augmented generation (RAG) pipelines using FAISS, ChromaDB, and Pinecone for AI-driven insights (see the sketch after this list).

• Deploy and monitor AI applications using FastAPI, Streamlit, and Docker, with attention to real-time performance.

• Work with cross-functional teams to ensure data security, compliance, and AI model reliability in production environments.
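To give a concrete flavor of the RAG responsibility above, here is a minimal retrieval sketch, assuming sentence-transformers for embeddings and FAISS for similarity search. The model name and the toy customs corpus are illustrative placeholders, not details from this posting:

```python
# Minimal RAG retrieval sketch (illustrative only): embed a small corpus,
# index it with FAISS, and fetch the passages most relevant to a query.
# Assumes: pip install faiss-cpu sentence-transformers
import faiss
from sentence_transformers import SentenceTransformer

# Placeholder embedding model and toy corpus, not specified by the posting.
model = SentenceTransformer("all-MiniLM-L6-v2")
corpus = [
    "Import declarations must include a valid HS code.",
    "Export permits are issued electronically via the single window.",
    "Transit cargo requires a customs bond before release.",
]

# Encode passages and build an exact inner-product index
# (normalized embeddings make inner product equal cosine similarity).
embeddings = model.encode(corpus, normalize_embeddings=True)
index = faiss.IndexFlatIP(int(embeddings.shape[1]))
index.add(embeddings)

# Retrieve the top-k passages for a query; in a full RAG pipeline these
# would be injected into an LLM prompt as grounding context.
query = model.encode(["How are export permits issued?"], normalize_embeddings=True)
scores, ids = index.search(query, 2)
for score, i in zip(scores[0], ids[0]):
    print(f"{score:.3f}  {corpus[i]}")
```

In practice, ChromaDB or Pinecone could stand in for FAISS when persistence or managed hosting is needed; the retrieve-then-generate shape stays the same.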

🎯 Requirements

• Bachelor's or Master's degree in Computer Science, Data Engineering, Artificial Intelligence, or a related field.

• 2-4 years of experience in data engineering, AI/ML, or cloud-based AI infrastructure.

• Expertise in Python, PySpark, and SQL for data transformation and large-scale processing.

• Experience with cloud platforms (AWS, Azure, GCP) for AI model deployment and data pipeline automation.

• Hands-on experience with vector databases (FAISS, ChromaDB, Pinecone) for efficient data retrieval.

• Proficiency in containerization and orchestration tools such as Docker and Kubernetes.

• Strong understanding of retrieval-augmented generation (RAG) and real-time AI model deployment.

• Knowledge of API development and AI service integration using FastAPI and Streamlit/Dash (a minimal FastAPI sketch follows this list).

• Ability to optimize AI-driven automation processes and ensure model efficiency.

• Strong analytical and problem-solving skills.

• Excellent communication and collaboration abilities.
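For the API-development requirement, a minimal FastAPI sketch of an AI service endpoint follows. The route, request schema, and stubbed scoring logic are assumptions for illustration, not part of the role description:

```python
# Minimal FastAPI sketch of an AI service endpoint (illustrative only).
# Assumes: pip install fastapi uvicorn
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Demo AI service")

class Query(BaseModel):
    text: str

@app.post("/predict")
def predict(query: Query) -> dict:
    # Stub: a real service would call a deployed model here.
    return {"input": query.text, "label": "placeholder", "score": 0.0}
```

Run locally with `uvicorn main:app --reload`; in production, a service like this would typically be containerized with Docker, in line with the deployment responsibilities above.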
