Lead Data and Integration Architect
(12 years of experience in Data Architecture, Integration and Data Lake)
Strong experience and a proven background in data modelling, taxonomies and ETL for Big Data, NoSQL and relational database systems
· Lead the end-to-end vision for the Data Platform strategy, including how a logical design translates into one or more physical databases and how data flows through the successive stages involved. He or she must be able to address data migration issues (validation, clean-up and mapping) and understand the importance of data dictionaries
· Ability to lead others and communicate effectively, with full responsibility for Data Analysis, Data Migration Tools, Data Modelling, Data Integration, Data Warehousing, Database Design and Data Security best practices
· Extensive experience with Data Lake and Informatica development best practices and standards. Big Data Cloudera certification is a plus.
· At least 5 years of experience defining and implementing Enterprise Data Strategy and Data Management Principles in the banking industry
· Expertise with data discovery and access, data analysis, data modelling and data virtualization.
· Extensive experience with various integration areas such as data integration, application integration and business process integration.
· Strong understanding of, and experience with, capabilities such as Data Governance and Data Content Management (reference data, master data, metadata, data life cycle, and data quality management), as well as technologies and tools for data modelling, virtualization, profiling, quality management, performance and delivery.
· Minimum 4 years of hands-on experience with real-time data processing on a Data Lake
· Minimum 4 years of experience in Informatica Development
· Minimum 2 years of experience in EDWH (Enterprise Data Warehouse) modelling and projects
· Strong Unix and Java development skills, with at least 4 years of experience.
· Strong business analysis and problem-solving skills.
· Understanding of multiple data architectures that can support a Data Warehouse.
· Experience in optimizing data loads and data transformations.
· Minimum 4 years of experience developing and implementing Informatica PowerCenter and/or Informatica Big Data Management
· Big Data experience with: Apache Spark, Kafka, Hive, Hadoop HDFS, OpenStack Swift, Parquet, ORC