Senior Data Engineer - Vice President
Company: SMBC Group
Location: Charlotte, North Carolina
Posted on: March 27, 2026
Job Description:
SMBC Group is a top-tier global financial group. Headquartered
in Tokyo and with a 400-year history, SMBC Group offers a diverse
range of financial services, including banking, leasing,
securities, credit cards, and consumer finance. The Group has more
than 130 offices and 80,000 employees worldwide in nearly 40
countries. Sumitomo Mitsui Financial Group, Inc. (SMFG) is the
holding company of SMBC Group, which is one of the three largest
banking groups in Japan. SMFG’s shares trade on the Tokyo, Nagoya,
and New York (NYSE: SMFG) stock exchanges. In the Americas, SMBC
Group has a presence in the US, Canada, Mexico, Brazil, Chile,
Colombia, and Peru. Backed by the capital strength of SMBC Group
and the value of its relationships in Asia, the Group offers a
range of commercial and investment banking services to its
corporate, institutional, and municipal clients. It connects a
diverse client base to local markets and the organization’s
extensive global network. The Group’s operating companies in the
Americas include Sumitomo Mitsui Banking Corp. (SMBC), SMBC Nikko
Securities America, Inc., SMBC Capital Markets, Inc., SMBC
MANUBANK, JRI America, Inc., SMBC Leasing and Finance, Inc., Banco
Sumitomo Mitsui Brasileiro S.A., and Sumitomo Mitsui Finance and
Leasing Co., Ltd. Role Description The VP, Senior Data Engineer, is
a critical role within the Information Security department and will
report to the Director, Information Security Data Operations for
the SMBC Americas Division. This VP role will be responsible for
designing, developing & deploying cloud data solutions for ISDAD.
This is part of the overall cyber data initiative focusing on
building out the security and risk data platform for Information
Security. This individual will be responsible for developing the
data feeds and will be a part of the larger development effort of
building out a Cybersecurity Data Lakehouse (CyberDW). The goal of
the CyberDW is to centralize ISDAD data and to establish effective
data governance around the data sources and their lineage. This
role will collaborate with the developers, data
owners, governance leads and business analysts within the
Information Technology (IT) department as well as other
stakeholders aligned with the applications owned by ISDAD. This
role will be aligned with the Continuous Controls Monitoring
program. KCIs (Key Control Indicators) and Data Quality (DQ) rules
will be created to continuously assess and report on the
effectiveness of ISDAD’s internal controls as part of the firm’s
GRC Risk Management and CDO Data Governance frameworks.

Role Objectives:
- Design, develop, test, and support cloud data solutions for the CyberDW catalogue, focusing on data ingestion, data quality, tuning, and performance of upstream cybersecurity data sources.
- Provide indirect leadership and mentorship to junior team members; lead data feed development efforts and design initiatives.
- Participate in development meetings to align development priorities and objectives, assign tasks, and share experiences and challenges with applications under development.
- Consult with other technology and development teams as needed to coordinate the integration of applications with the larger company software ecosystem.
- Capture and document metadata for identified Key Data Elements (KDEs) to ensure accuracy and completeness for Data Quality rules and the processing of daily datasets.
- Work with the data architecture team to align KDEs to logical data models, develop physical data structures, and document physical data names, definitions, and data types.
- Partner with data owners and stakeholders to create technical requirements and DQ rules for the data elements needed in the CyberDW.
- Partner with the Business Management team and Data Owners to understand which critical metrics and data fields are needed for metric dashboards.
- Establish CyberDW views that encapsulate the data so it is fit for downstream consumption.
- Ensure the data aligns with the DQ rules established on its metadata so it is fit for daily use.
- Use the IBM TWS production job scheduling system and adhere to standards for daily scheduling and batch monitoring of production jobs.
- Identify and resolve DQ issues, including inaccuracies and incomplete information.
- Enhance data quality efforts by implementing improved procedures and processes.
Qualifications and Skills:
- Bachelor's degree in Computer Science, Information Security, Data Management, or a related field.
- 10 years' experience in IT development, data governance, data architecture, or related roles, preferably in a highly regulated environment such as financial services.
- 5 years' experience with Azure Databricks, Azure Data Factory, Azure Functions, Azure Data Lake Storage, Azure Event Grid, Azure Log Analytics, Azure Monitor, and Unity Catalog repository configuration.
- Proficient in data management and data modeling tools (e.g., Collibra DQIM/DQE, IBM InfoSphere DA).
- Strong knowledge of CI/CD and DevOps tooling (e.g., Git, Jenkins, Azure DevOps).
- Proficient in SQL Server, Oracle PL/SQL, T-SQL, and SQL stored procedures.
- Proficient in Python, Java, or similar high-level server-side languages.
- Strong knowledge of enterprise Information Security data (e.g., phishing, identity management, privileged access, cloud security, incident response, vulnerability management, threat detection).
- Data knowledge of PaaS/SaaS products (e.g., ServiceNow, CrowdStrike, MS Purview, Proofpoint, WIZ.IO, JIRA, SharePoint, Azure Active Directory, SAI360).
- Knowledge of Microsoft Sentinel for security information and event management (SIEM) is a plus.
- Understanding of information security frameworks and security controls (e.g., NIST, CIS, CRI Profile) and regulatory compliance (e.g., NYSDFS, GDPR, CCPA).
- Experience with REST API web services and microservice architecture.
- Strong understanding of ETL/ELT.
- Knowledge of IBM Tivoli Workload Scheduler is a plus.
- Exposure to Power BI for data visualization and reporting is a plus.
- Problem-solving and analytical skills, with an initiative-taking and results-oriented approach.

Additional Requirements:
SMBC's
employees participate in a Hybrid workforce model that provides
employees with an opportunity to work from home as well as from
an SMBC office. SMBC requires that employees live within a
reasonable commuting distance of their office location. Prospective
candidates will learn more about their specific hybrid work
schedule during their interview process. Hybrid work may not be
permitted for certain roles, including, for example, certain
FINRA-registered roles for which in-office attendance for the
entire workweek is required. SMBC provides reasonable
accommodations during candidacy for applicants with disabilities
consistent with applicable federal, state, and local law.