Data Engineering Specialist
IMPACT Initiatives (HQ)
Posted on 15 December
Nationality
Any Nationality
Gender
Not Mentioned
Number of Vacancies
1
Job Description
Roles and Responsibilities
RESPONSIBILITIES
CO-LEAD IMPACT'S DIGITAL DATA TRANSFORMATION INITIATIVES
- Participate in the design and implementation of the organization's digital data transformation, identifying opportunities for automation, improvement, and innovation.
DESIGN, IMPLEMENT AND MAINTAIN SCALABLE DATA PIPELINES AND DATA STORAGE INFRASTRUCTURE
- Build and maintain a secure central system (data warehouse/data lake) for IMPACT's data. Automate pipelines from raw data ingestion to cleaned and harmonized datasets. Enable improved access to data for users through scalable interfaces, and facilitate both internal sharing across country teams and controlled external dissemination.
ENSURE HIGH-QUALITY, RELIABLE AND REUSABLE DATA PIPELINES
- Build tools, pipelines, and processes that ensure data quality standards at scale and support semi-automated workflows from data collection to outputs.
OPERATIONALIZE, MONITOR AND OPTIMIZE DATA PROCESSES
- Collaborate with country teams and stakeholders to ensure smooth implementation, monitor performance, maintain pipelines, troubleshoot issues, and conduct regular reviews of workflows to identify bottlenecks, recommend optimizations, and propose improvements based on empirical data analysis.
DEFINE AND IMPLEMENT DATA STANDARDS AND BEST PRACTICES ACROSS MISSIONS
- Participate in establishing and promoting consistent data standards for data quality, coding practices, and data collection processes. Support country teams in applying these practices to ensure consistency and reliability of data workflows across all missions.
ADDITIONAL TASKS (ON AN AD-HOC, BY-NEED BASIS)
- Provide ad-hoc technical support, guidance, and troubleshooting to global and country teams (up to 25-30% of monthly time).
REQUIREMENTS
QUALIFICATIONS & EXPERIENCE
- Technical Knowledge & Skills
- Proven ability to design, implement, maintain, and optimize centralized data storage solutions and scalable data processing workflows
- Strong passion for data engineering and data science challenges, with a focus on building efficient, reliable, and maintainable solutions
- Advanced working knowledge of Python, R and SQL
- Solid experience with relational databases and ETL
- Experience implementing and maintaining CI/CD pipelines and applying coding best practices (unit testing, logging, version control, ...)
- Soft skills
- Strong analytical and problem-solving skills with attention to detail and commitment to producing high quality work
- High level of autonomy and initiative, able to independently drive tasks and proactively propose innovative technical solutions.
- Ability to communicate clearly and effectively with team members, including when facing technical challenges.
- Openness to feedback, continuous learning, and knowledge sharing.
- Adaptability and flexibility to thrive in a fast-paced, results-oriented environment, including remote and cross-country collaboration.
- Fluency in English
DESIRED
- Experience with cloud computing platforms and services, preferably Azure
- Experience in Python-based data engineering workflows, including libraries and frameworks such as data load tool (dlt), SQLModel, SQLMesh, dbt, Airflow or similar tools for building, automating, orchestrating, and monitoring data pipelines
- Experience with KoboToolbox, focusing on interaction via the API and understanding the various data structures stored on the server and accessible through the API.
- Knowledge or experience in developing data-oriented web interfaces or dashboards
- Experience defining, promoting, and supporting adoption of technical standards and best practices across teams and country missions
- Awareness of emerging technologies and trends in data engineering and data science, and capacity to assess their applicability to organizational needs
- Good understanding of statistics and standard machine learning algorithms (supervised and unsupervised)
- Fluency in French and/or Spanish
- Familiarity with the humanitarian aid system
Desired Candidate Profile
QUALIFICATIONS & EXPERIENCE
- Master's degree or higher in a relevant discipline (e.g., Computer Science, Data Science, Statistics, Econometrics, or related fields), or equivalent knowledge acquired through self-learning and/or previous professional experience in data engineering or data science
- At least 5 years of relevant work experience, preferably in the humanitarian or development sector
Company Industry
- Non-Governmental Organizations (NGOs)
- Social Services
- Non-Profit Organizations
Department / Functional Area
- IT Software
Keywords
- Data Engineering Specialist