Big Data Solution Architect - KPMG, Dubai
Responsibilities:
Facilitate the establishment and execution of the roadmap and vision for information delivery and management, including the enterprise data warehouse, big data, BI & analytics, content management and data management
Propose and endorse the target data architecture and ensure its implementation and upkeep
Play a critical role in tool selection, implementation partner selection and on-boarding of key technical skills to ensure the success of technical sub-initiatives, including data quality and master data management, ODS and DWH development, data marts, BI and advanced analytics, and legacy data migration to the EDW
Lead Business Analysts, Data Engineers and Data Scientists to architect and deliver solutions that fulfill the business's information needs and align with the information vision and strategy
Apply business strategy while driving technology strategy, balancing short-term and long-term needs to ensure that the architecture can scale and evolve accordingly
Take an end-to-end approach, connecting all the pieces of data and leveraging all available assets to deliver it
Provide advice, guidance and direction on carrying out major plans and procedures to ensure schedule, performance and budget targets are met.
Qualifications:
Minimum 8 years of experience in enterprise data architecture, integration and analytics, with a proven track record of advising clients and delivering large projects in strategy, design and implementation of Data Warehouse, Big Data and Analytics solutions
Minimum 5 years leading a team including data engineers, data scientists and BI developers.
Bachelor’s degree in computer science or a relevant field from an accredited college or university; Master’s degree from an accredited college or university preferred
Strong technical experience in the following:
Data warehousing, BI and big data stacks and solutions, including cloud technologies.
Data architecture, including data model design and implementation (relational, dimensional and NoSQL)
Data integration design and implementation (e.g. ETL, messaging, replication and APIs)
Deep hands-on experience with the following tools and technologies is preferred: Cloudera Distribution Hadoop, HP Vertica (data warehouse), Talend/Spark (data integration), Qlik (reporting), Denodo (data virtualization) and Python (machine learning).
Apply Here:
https://elzw.fa.em8.oraclecloud.com/hcmUI/CandidateExperience/en/sites/CX_1001/job/394
=======================================================================
Create a new CV to apply for jobs in the GCC:
1. Applicant Tracking System (ATS)-compliant CV. (This CV is used for online applications on job sites. The ATS is the software that reads and stores your CV in a database.)
2. Visual/infographic CV. (This CV is normally sent to recruiters by email. It is visually appealing, but it may not be ATS-compliant.)
To know more about ATS-compliant and infographic CVs, read here:
https://www.dubai-forever.com/cv-writing-services.html#ATS-Compliant-CV