Job Description
Context:
A global organization is seeking to improve the operability and performance of its data services, which are built on several IT platforms: a data lake, data mart, data warehouse, and BI & Data Science environments.
While these platforms are mature and already in production, there is a need for tooling to address orchestration challenges and ensure operational performance and coordination, aligned with the service levels expected by business stakeholders.
Mission:
In collaboration with the Data Architect, data experts, and the product owner/manager, you will support operations and contribute to implementing the technical roadmap of the orchestration platform based on Airflow.
You will be responsible for developing and providing expert support for both the platform’s core features and the DevOps tooling around it—especially observability, performance management, and continuous improvement of the orchestration service.
Location: Beirut, Lebanon
Experience:
- 5+ years with Airflow
- OR 3+ years as a Data Engineer with AWS + Python
Key Activities:
- Contribute to the Airflow product development lifecycle
- Implement and develop platform features and tools in agreement with the Data Architect and experts
- Develop DAGs (Directed Acyclic Graphs) to support operability of data lake and data mart platforms (AWS and Snowflake)
- Provide technical support to end-users of the platform
- Ensure code delivery and deployment
- Operate and monitor the platform (“run” mode)
- Provide Level 2 production support and participate in crisis management
- Contribute to the product’s Knowledge Base
- Report platform performance, real-time health, and key risks to the Airflow platform manager
- Participate in the Data Orchestration community
- Act as a key user for teams using the platform: BI, business projects, Data Science, DataLake, etc.
- Join regular meetings with users to share development and usage best practices
- Transfer knowledge to offshore support teams
- Produce documentation: tutorials, operations manuals, etc.
- Lead product demo sessions and knowledge transfer
- Support initial developments by the support team
- Assist support teams in resolving incidents/issues
Deliverables:
- Coding and production deployment of DAGs
- Unit test reports for implemented DAGs
- Contributions to platform improvement proposals
- Delivery planning / roadmap via JIRA
- Development best practices documented in Confluence
- Incident and request ticket updates in JIRA
- Support knowledge base (ServiceNow and Confluence)
- Committee presentation materials and meeting minutes
Required Skills:
- AWS Cloud/Data Engineering
- Airflow process orchestration
- ETL/ELT principles (e.g., dbt)
- Agile (Scrum/Kanban) methodologies
- Knowledge of the data management ecosystem (Data Engineer profile)
- DevOps fundamentals
- IT systems observability fundamentals
Soft Skills:
- Proactive
- Independent
- Energetic
- Team-oriented / Communicative (a plus)
Tool Proficiency:
- Airflow (required)
- AWS
- Python
- Snowflake
- dbt
- JIRA / Confluence
- Jenkins
- GitLab
- Terraform
- Kubernetes / Containers
- Elasticsearch / Kibana (a plus)