24 Hour Fitness, Inc. Data Integration Engineer in Carlsbad, California
LOCATION: 1265 Laurel Tree Lane, Suite 200, Carlsbad, CA 92011
The Data Integration Engineer works in the Data & Analytics department under the Data Delivery Manager. The role is responsible for the design and development of data flow and integration processes across a diverse portfolio of data sources (internal applications, SaaS, third-party vendors) and data architectures (Cloud to On-Prem, On-Prem to Cloud, Cloud to Cloud, and other hybrid models). This position is highly technical, focusing on Big Data technologies and platforms that streamline data movement through ingestion, storage, and processing. The ideal candidate will be a data specialist responsible for driving innovation and process improvements using modern data integration solutions and technologies. The position will work closely with IT leaders, Architects, Product Owners, software developers, system analysts, business subject matter experts, and technology partners to plan, prototype, and build innovative data delivery solutions.
ESSENTIAL DUTIES & RESPONSIBILITIES Estimated % of Time Spent
Drive development and implementation of our Big Data initiatives.
Develop integration processes for data collection (pub/sub) and connection, and for the reporting, modeling, and mining of large-scale applications (systems of engagement, automation, and transactions).
Conduct research on new technologies to evolve our data lake to support our digital transformation in engaging with our customers and team members.
Assess the performance of data solutions, diagnose data problems, and apply tools to monitor and tune performance.
Design, construct, install, test and maintain highly scalable data management systems
Collaborate with application architects, developers, system and business analysts
Integrate new data management technologies and software engineering tools into existing structures
Recommend and implement ways to improve data reliability, efficiency and quality
Proactively understand and anticipate the company's information needs and identify new opportunities to improve our information consumption. 30%
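The data-collection (pub/sub) duty above can be sketched in miniature. This is a hedged, in-memory illustration only; the Broker class, topic name, and message fields are hypothetical, and a real pipeline would typically use a message broker such as Kafka rather than this toy:

```python
# Minimal in-memory publish/subscribe sketch (illustrative only).
# Real data-collection pipelines would use a broker such as Kafka.
from collections import defaultdict


class Broker:
    def __init__(self):
        self._subs = defaultdict(list)  # topic -> list of subscriber callbacks

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, message):
        # Deliver the message to every subscriber of this topic.
        for cb in self._subs[topic]:
            cb(message)


broker = Broker()
received = []
broker.subscribe("checkins", received.append)
broker.publish("checkins", {"member_id": 1, "club": "Carlsbad"})
print(received)  # the subscriber has received the published event
```

The pattern decouples producers from consumers: the publisher needs no knowledge of who is listening, which is what makes pub/sub useful for collecting data from many source systems.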
Data sourcing, modeling and cataloging.
Design changes, provide solutions and manage additions to the Enterprise Data Warehouse & Data Marts working in conjunction with data architect and reporting team.
Identify data requirements by meeting with the Data Architecture and Governance team and align with their priorities. Identify the system of record and the best source for different data needs by working with IT and application owners. Obtain and document data dictionaries from source systems. Understand and document data elements and their business purpose with the business and reporting teams. Become an expert on source system back-end data and its location, structure, and purpose. Get involved in the initial implementation of new IT applications and identify dependencies and impacts to the Data Lake, EDW, and Data Marts. Become knowledgeable on department applications, programming, and operations; evaluate existing systems and design proposed solutions.
Define technologies and practices to build a foundational and ongoing process for data catalog capture (Source System, Data Lake, Data Warehouse, Data Marts, etc.) to properly organize, integrate, and curate data for business use. Keep the catalog up to date with comprehensive definitions as systems change. 30%
Data Application Support
Ensure developed solutions have the robustness, reliability, performance and scalability necessary to support the growth and other needs of the business.
Promote and employ technical analysis and problem-solving skills within the development team in identification of the root causes of business performance and execution issues.
Apply and define best practices for the Data Development and Applications team by introducing repeatable but agile processes to ensure root cause analysis and corrective action plans are performed, executed, managed, and measured effectively.
Ensure proper change control and configuration management is exercised throughout the data integration development lifecycle 15%
Documentation and Standards
Adhere to Information Technology Information Management development standards and support the team in following the same standards.
Own data flow diagram documentation and understand the whole data integration flow.
Be the expert and go-to person for data integration processes, transformations, and high-level data rules. Become very knowledgeable on the data models for the different levels of data repositories, and collaborate in documenting them.
Leverage personal technical experience and ideas for best tools and practices around data integration lifecycle, data cataloging, data flow design tools and data modeling tools.
Commit to project quality and on-time delivery.
Enforce the documentation standards across all team members and developers.
Document mappings and processes to share amongst team members. 15%
Collaboration and Organizational Relationships
Work closely with Data Delivery manager to understand priorities and project timelines.
Work with internal Data Integration developers and their leader to ensure the implementation matches specifications through regular code reviews and demos.
Work with the data architect and reporting team to understand data requirements and data usage, and collaborate in data modeling activities. Work with the Data Scientist, Reporting Governance Manager, and Consumer Insights Manager to validate, confirm, and test data integration solutions.
Regularly work with Database Administrators, the Application Operations team, IT Information Systems, IT CS, IT Security, and IT DevOps to ensure enterprise implementation standards, reviews, release processes, security requests, and configuration changes are aligned across IT to streamline implementation.
Work with internal team members and manager to estimate efforts and plan activities. 10%
This role partners with Data Architecture and Governance, Data Science, Advanced Analytics, Marketing, IT, and reports to the Manager of Data Development & Applications within the Data & Analytics organization.
Knowledge, Skills & Abilities
Good knowledge of Big Data platforms and applications
Knowledge of cloud and hybrid solutions (Azure, AWS, SaaS data integration and data repository solutions)
Ability to design diagrams and data flows on the fly on a whiteboard and migrate them into the proper tool
Ability to capture requirements during meetings; able to lead technical meetings and discussions with IT teams and communicate meeting notes.
Experience supporting production systems 24/7, 365 days a year.
Availability and willingness to be on call to support batch, micro-batch, and other data processes in production.
Development experience with Data Lake technologies for stream ingestion, storage, and processing, such as Snowflake, Azure Data Warehouse, Kafka, Flume, Spark Streaming, Apache Storm, AWS S3, Azure, Hadoop, Spark, HBase, MongoDB, Hive, or Cassandra
Experience with data integration implementation using Informatica products (PowerCenter, EDC, Axon, BDM) or others such as IBM InfoSphere, Jupyter Notebook, MLflow, etc.
Expert knowledge of SQL and the ability to implement advanced queries to extract information from very large datasets and diverse databases (SQL Server, MySQL, Oracle, Snowflake, Azure, etc.)
Experience implementing software and/or solutions in the enterprise Linux or Unix environment
Strong understanding of the security requirements around integration, such as exchanging certificates, using tokens/keys, or simple password-based authentication.
Strong understanding of network configuration, devices, protocols, speeds and optimizations.
Strong understanding of mapping and identifying the appropriate level of trust for different data elements, aligned to analytics and data governance initiatives.
Knowledge of and experience in implementing BI solutions (MicroStrategy, Power BI, Tableau, Qlik)
Familiarity with scripting tools such as Bash shell scripts and Python
Proven characteristics of intellectual curiosity, leadership, responsibility, determination, creativity, flexibility, drive and self-confidence.
Excellent written, oral communication, project management and presentation skills. Ability to present complex data reports, financial analyses, and statistics in a simple and clear way.
Demonstrated ability in managing details and delivering on timelines.
Strong problem-solving skills.
Experience working in Agile teams and following Scrum methodology
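As a hedged illustration of the advanced-SQL skill listed above, the following sketch uses Python's built-in sqlite3 module as a stand-in for the databases named (SQL Server, Oracle, Snowflake, etc.); the visits table, its columns, and the sample data are hypothetical:

```python
# Illustrative only: an "advanced query" using a window function,
# run against an in-memory SQLite database. Table and column names
# are hypothetical stand-ins for a real membership dataset.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE visits (member_id INTEGER, club TEXT, checkins INTEGER)")
cur.executemany(
    "INSERT INTO visits VALUES (?, ?, ?)",
    [(1, "Carlsbad", 12), (2, "Carlsbad", 7), (3, "Oceanside", 20), (4, "Oceanside", 5)],
)

# Rank members by check-ins within each club using a window function.
rows = cur.execute("""
    SELECT club, member_id, checkins,
           RANK() OVER (PARTITION BY club ORDER BY checkins DESC) AS rnk
    FROM visits
    ORDER BY club, rnk
""").fetchall()
for row in rows:
    print(row)
```

Window functions such as RANK() OVER (PARTITION BY ...) are the kind of construct that distinguishes advanced extraction queries from simple SELECTs, and the same syntax carries over to SQL Server, Oracle, and Snowflake.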
Minimum Educational Level/Certifications
- 4-year Bachelor's degree in Engineering, Business Analytics, Computer Science, or a related field.
Minimum Work Experience and Qualifications
4+ years of experience implementing Data Warehouse, Data Mart, Data Lake, and Big Data solutions
6+ years of hands-on experience in implementing data integration solutions
3+ years of experience designing, architecting, and documenting data flows, data dictionaries, and data architectures.
4+ years of experience with advanced SQL for data analysis and extraction across multiple databases (Oracle, SQL Server, MySQL, Snowflake)
FUNCTIONAL GROUP Corporate Operations