Chief Data Officer
New York City, NY, USA | Global Atlantic Financial Group
Industry: Insurance - General
IT / Information Technology
Job Description:
The CDO is the senior executive responsible for developing and governing our data and information strategy to drive business decisions and growth. The CDO builds and develops the enterprise data ecosystem, establishes data procedures and policies, and partners closely with the rest of the organization to provide, prepare, organize and analyze data assets while ensuring that the company meets industry best practices. The CDO leads interdisciplinary teams to improve and streamline data systems within the company and drive innovation. The CDO provides the vision for business-wide data activities and champions data ownership, standardization, accessibility, and governance as follows.
Drive the strategy, development and deployment of the enterprise's data hub platform as well as a roadmap for its future development that matches and supports business needs.
Define, manage and ensure an adequate information trust model and controls for master data and metadata management, including reference data.
Drive automation through effective metadata management: use innovative and modern tools, techniques and architectures to partially or completely automate the most common, repeatable and tedious data preparation and integration tasks, minimizing manual, error-prone processes and improving productivity.
Define, manage and advance enterprise information management principles, policies and programs for stewardship and custodianship of data
Identify and standardize the use and governance of data. Ensure that data users and consumers use the data provisioned to them responsibly through data governance and compliance initiatives.
Collaborate across departments: work with varied stakeholders within the organization to refine their data requirements for various initiatives and data consumption needs. Propose appropriate (and innovative) data ingestion, preparation, integration and operationalization techniques to optimally address these requirements.
Be the corporate leader of data-driven insights that help support exploitation of strategic and tactical business opportunities, and be a champion for a data-driven, decision-making culture
Develop and maintain controls on data quality, interoperability and sources to effectively manage corporate risk, financial reporting and regulatory requirements associated with the use of data.
A minimum of 10 years of experience in IT. At least 3 years of work experience in data management disciplines, including data integration, modeling, optimization and data quality.
In-depth experience designing and implementing information solutions.
System integration experience, including interface design
Experience with distributed management and analytics in cloud and hybrid environments, along with an understanding of a variety of data access and analytics approaches (for example, microservices and event-based architectures).
Strong communication and persuasion skills, including the ability to create messaging materials that meet stakeholder needs.
Ability to analyze project, program and portfolio needs, as well as determine the resources needed to achieve objectives and overcome cross-functional barriers.
Business acumen and the ability to communicate with executives, business domain stakeholders and technical staff alike. Ability to communicate, influence and persuade peers and leadership.
Experience working with SQL-on-Hadoop tools and technologies, including Hive, Impala and Presto, among others, from an open-source perspective, and Hortonworks Data Flow (HDF), Dremio, Informatica and Talend, among others, from a commercial vendor perspective.
Experience working with both open-source and commercial message queuing technologies (such as Kafka, JMS, Azure Service Bus and Amazon Simple Queue Service), stream data integration technologies (such as Apache NiFi, Apache Beam, Apache Kafka Streams and Amazon Kinesis) and stream analytics technologies (such as Apache Kafka KSQL, Apache Spark Streaming and Apache Samza).
Experience working with large, heterogeneous datasets to build and optimize data pipelines, pipeline architectures and integrated datasets using traditional data integration technologies. These should include ETL/ELT, data replication/CDC, message-oriented data movement, API design and access, and emerging data ingestion and integration technologies such as stream data integration, CEP and data virtualization.
Hands-on experience implementing data management programs is preferred.