Client: Shuvel Digital

Data Engineer



Job Type: All Other Remote

Job Details

Basic Purpose:

Develop strategies for data acquisition, archive recovery, and database implementation. Responsible for designing and building solutions, integrating data from various sources, and managing big data. Develop and write complex queries that are easily accessible and perform smoothly, with the goal of optimizing the performance of Navy Federal's big data ecosystem. Recognized as an expert with a specialized depth and/or breadth of expertise in the discipline. Solves highly complex problems and takes a broad perspective to identify solutions. Leads functional teams or projects. Works independently.


  • Provide Business Intelligence (BI) and Data Warehousing (DW) solutions and support by leveraging project standards and leading analytics platforms
  • Evaluate and define functional requirements for BI and DW solutions
  • Define and build data integration processes to be used across the organization
  • Build conceptual and logical data models for stakeholders and management
  • Work directly with business leaders to understand data requirements; propose and develop solutions that enable effective decision-making and drive business objectives
  • Prepare advanced project implementation plans which highlight major milestones and deliverables, leveraging standard methods and work planning tools
  • Analyze and interpret collected data; spot trends; write reports and recommendations for internal or external clients
  • Document existing and new processes to develop and maintain technical and non-technical reference materials
  • Recognize potential issues and risks during the analytics project implementation and suggest mitigation strategies
  • Coach and mentor project team members in carrying out analytics project implementation activities
  • Interpret data presented in models, charts, and tables and transform it into a format that is useful to the business and promotes effective decision making
  • Own and communicate the process of manipulating and merging large datasets
  • Serve as the expert and key point of contact between data analysts/data scientists and project/functional analytics leads
  • Perform other duties as assigned

Qualifications and Education Requirements:

  • Master's degree in Information Systems, Computer Science, Engineering, or related field, or the equivalent combination of education, training and experience
  • Expert skill in Microsoft Excel and Online Transaction Processing (OLTP) data integration
  • Advanced skill in Microsoft Azure, Informatica, MS Office Products, SQL
  • Proficient in Microsoft SharePoint, Azure Data Catalog, Microsoft Power BI, Data Architecture, Apache Spark
  • Novice skill level in Python, Agile frameworks (SAFe), Azure Databricks, Snowflake, Azure Data Factory, Apache Hive, Cognos
  • Knowledge of and the ability to perform basic statistical analysis such as measures of central tendency, normal distribution, variance, standard deviation, basic tests, correlation, and regression techniques
  • Experienced in sourcing, maintaining, and updating data
  • Strong understanding of SQL
  • Ability to understand the business problem and determine what aspects of it require optimization; articulate those aspects in a clear and concise manner
  • Experienced in the use of ETL tools and techniques
  • Ability to understand other projects or functional areas in order to consolidate analytical needs and processes
  • Demonstrates change management and excellent communication skills
  • Understands data warehousing, data cleaning, data pipelines and other analytical techniques required for data usage
  • Demonstrates deep understanding of multiple data-related concepts
  • Working knowledge of various data structures and the ability to extract data from various data sources (such as Cognos, Informatica)
  • Understands the concepts and application of data mapping and building requirements
  • Understands data models, large datasets, business/technical requirements, BI tools, data warehousing, statistical programming languages and libraries
  • Demonstrates functional knowledge of data visualization tools such as Microsoft Power BI, Tableau
  • Working knowledge of various data structures and the ability to manipulate data within visualization tools
  • Experience using Git
  • Skilled in managing the process between updating and maintaining data source systems and implementing data-related requirements