Data Engineer

Technology · Remote

Description

Duties and Responsibilities:

  • Experience with at least two end-to-end Snowflake projects, including development and production support
  • Knowledge of SQL and cloud-based technologies; experience developing and maintaining ETL processes that move data from source systems to Snowflake
  • Understanding of data warehousing concepts, data modeling, metadata management, data lakes, multidimensional models, and data dictionaries
  • Experience migrating to Snowflake on AWS or Azure
  • Performance tuning and setting up resource monitors
  • Snowflake modeling: roles, databases, and schemas
  • SQL performance measurement, query tuning, and database tuning
  • Integration with third-party tools; coding experience in languages such as Python, Java, and JavaScript
  • Root-cause analysis of data model issues, with proposed solutions
  • Exposure to Hadoop, Spark, and other big data tools
  • Experience managing XML, JSON, and CSV data sets from disparate sources
  • Data ingestion into Snowflake (a brief illustrative sketch follows this list)
  • Monitor and optimize query performance
  • Troubleshoot and debug data issues
  • Experience working in an agile environment
  • Troubleshooting and problem-solving skills
  • Good communication skills
  • Must work within the designated hours.
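
For illustration only (not an additional requirement), here is a minimal sketch of two of the duties above: creating a resource monitor and bulk-loading staged files into Snowflake, using the snowflake-connector-python library. All object names, the warehouse, and the connection parameters are hypothetical placeholders.

```python
# Illustrative sketch only. All object names, the warehouse, and the
# connection parameters below are hypothetical placeholders.
import os

import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    role="ACCOUNTADMIN",   # creating resource monitors requires ACCOUNTADMIN
    warehouse="ETL_WH",
)

try:
    cur = conn.cursor()

    # Resource monitor: cap monthly credit usage, notify at 80 percent,
    # and suspend the warehouse when the quota is exhausted.
    cur.execute("""
        CREATE OR REPLACE RESOURCE MONITOR etl_monitor
          WITH CREDIT_QUOTA = 100
               FREQUENCY = MONTHLY
               START_TIMESTAMP = IMMEDIATELY
          TRIGGERS ON 80 PERCENT DO NOTIFY
                   ON 100 PERCENT DO SUSPEND
    """)
    cur.execute("ALTER WAREHOUSE ETL_WH SET RESOURCE_MONITOR = etl_monitor")

    # Ingestion: bulk-load CSV files from an existing stage into a target table.
    cur.execute("""
        COPY INTO RAW.SALES.ORDERS
        FROM @RAW.SALES.ORDERS_STAGE
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '"')
        ON_ERROR = 'ABORT_STATEMENT'
    """)
finally:
    conn.close()
```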


Requirements

Required and Desired Skills/Certifications:

  • End-to-end Snowflake project experience, including development and production support
  • Knowledge of SQL and cloud-based technologies; experience developing and maintaining ETL processes that move data from source systems to Snowflake
  • Understanding of data warehousing concepts, data modeling, metadata management, data lakes, multidimensional models, and data dictionaries
  • Experience migrating to Snowflake on AWS or Azure
  • Performance tuning and setting up resource monitors
  • Snowflake modeling: roles, databases, and schemas
  • SQL performance measurement, query tuning, and database tuning
  • Integration with third-party tools; coding experience in languages such as Python, Java, and JavaScript
  • Exposure to Hadoop, Spark, and other big data tools
  • Experience managing XML, JSON, and CSV data sets from disparate sources (see the sketch after this list)
  • Data ingestion into Snowflake
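
Purely illustrative, not an additional requirement: a small Python sketch of normalizing CSV, JSON, and XML records from disparate sources into a common shape before staging them for Snowflake. The file paths, record tag, and directory layout are hypothetical.

```python
# Illustrative only: normalize records from CSV, JSON, and XML sources into
# a common list of dicts before staging and loading into Snowflake.
# File paths and the XML record tag are hypothetical placeholders.
import csv
import json
import xml.etree.ElementTree as ET
from pathlib import Path


def read_csv(path: Path) -> list[dict]:
    with path.open(newline="", encoding="utf-8") as f:
        return [dict(row) for row in csv.DictReader(f)]


def read_json(path: Path) -> list[dict]:
    data = json.loads(path.read_text(encoding="utf-8"))
    return data if isinstance(data, list) else [data]


def read_xml(path: Path, record_tag: str = "record") -> list[dict]:
    root = ET.parse(path).getroot()
    return [{child.tag: child.text for child in rec} for rec in root.iter(record_tag)]


READERS = {".csv": read_csv, ".json": read_json, ".xml": read_xml}


def load_source(path: Path) -> list[dict]:
    try:
        return READERS[path.suffix.lower()](path)
    except KeyError:
        raise ValueError(f"Unsupported source format: {path.suffix}")


if __name__ == "__main__":
    # Hypothetical landing directory; in practice these rows would be written
    # to a stage and loaded into Snowflake with COPY INTO.
    for source in Path("landing").glob("*"):
        if source.is_file():
            rows = load_source(source)
            print(f"{source.name}: {len(rows)} records")
```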