Senior Staff Data Engineer
Career Renew, United States

Experience: 1 Year
Traveling: No
Telecommute: No
Qualification: As mentioned in job details
Total Vacancies: 1
Posted on: Nov 22, 2023
Last Date: Dec 22, 2023
Location(s): Austin, Texas (onsite)

Job Description

Career Renew is recruiting a Senior Staff Data Engineer on behalf of one of its clients, a public cloud IT transformation services company. The role is based in Austin, Texas, and is fully onsite.

As a Senior Staff Data Engineer, Corporate, you will work with big data and emerging Google Cloud technologies to drive corporate services. You will design, develop, and maintain the Enterprise Data Warehouse solution that best fits our corporate needs, interacting with all of our business units and with Google Cloud subject matter experts.

Spanning business requirements analysis, solution architecture, data modeling, ETL, metadata management, and business continuity, you will work collaboratively with architects and other engineers to recommend, prototype, build, and debug data infrastructures on Google Cloud Platform (GCP). You will work on real-world data problems facing our customers today. Engagements range from purely consultative to heavily hands-on and cover a diverse array of domain areas, such as data migrations, data archival and disaster recovery, and big data analytics solutions requiring batch or streaming data pipelines, data lakes, and data warehouses.
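As a hedged illustration of the streaming side of such engagements, a minimal sketch of a pipeline in this style follows: an Apache Beam job that reads JSON events from a Pub/Sub subscription and appends them to a BigQuery table. The project, subscription, table, and schema names are placeholders, not the client's actual resources.

```python
# Minimal streaming ETL sketch (placeholder resource names, assumed event schema):
# Pub/Sub -> parse JSON -> append rows to BigQuery.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # streaming=True so the pipeline runs as an unbounded/streaming job
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-project:analytics.events",
                schema="event_id:STRING,user_id:STRING,ts:TIMESTAMP",
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

On GCP a pipeline like this would typically run on Dataflow; the same Beam code also handles batch sources, which is one reason the framework appears in the requirements below.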

You will be expected to run point on whole projects, end to end, and to mentor less experienced Data Engineers. You will be recognized as an expert within the team and will build a reputation with Google and our customers. You will repeatedly deliver project architectures and critical components on which other engineers defer to your expertise. You will also participate in early-stage opportunity qualification calls and guide client-facing technical discussions for established projects.

Requirements

Mastery in the following domain area:

Data warehouse modernization: building complete data warehouse solutions on BigQuery, including technical architectures, star/snowflake schema designs, query optimization, ETL/ELT pipelines, and reporting/analytic tools. Must have expert-level experience working with Google's batch or streaming data processing solutions (such as BigQuery, Dataform, and BI Engine).
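For context, here is a hedged sketch of the kind of ELT step this domain area implies: a parameterized BigQuery SQL transform, driven from the Python client, that loads one day of staging rows into a star-schema fact table. All project, dataset, table, and column names are hypothetical.

```python
# Hypothetical ELT step: transform staging rows into a star-schema fact table
# entirely inside BigQuery (the warehouse does the heavy lifting).
import datetime

from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

elt_sql = """
INSERT INTO `my-project.dw.fact_orders` (order_id, customer_key, date_key, amount)
SELECT
  s.order_id,
  c.customer_key,
  CAST(FORMAT_DATE('%Y%m%d', s.order_date) AS INT64) AS date_key,
  s.amount
FROM `my-project.staging.orders` AS s
JOIN `my-project.dw.dim_customer` AS c
  ON c.customer_id = s.customer_id
WHERE s.order_date = @load_date
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("load_date", "DATE", datetime.date(2023, 11, 22))
    ]
)

# Run the transform as a BigQuery job and wait for it to finish.
client.query(elt_sql, job_config=job_config).result()
```

In practice a step like this would usually be orchestrated (for example by Composer or Dataform) rather than run ad hoc.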

Proficiency in the following domain areas:

Big Data: managing Hadoop clusters (all included services), troubleshooting cluster operation issues, migrating Hadoop workloads, architecting solutions on Hadoop, experience with NoSQL data stores like Cassandra and HBase, building batch/streaming ETL pipelines with frameworks such as Spark, Spark Streaming, and Apache Beam, and working with messaging systems like Pub/Sub, Kafka and RabbitMQ.

Data Catalog: Managing Data Catalogs, definitions, and data lineage.

Data Quality: must have experience with Dataform or other data quality (DQ) solutions.

Data migration: migrating data stores to reliable and scalable cloud-based stores, including strategies for minimizing downtime; this may involve conversion from relational to NoSQL data stores, or vice versa.

Backup, restore & disaster recovery: building production-grade data backup, restore, and disaster recovery solutions at up to petabyte scale.
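To make the backup/restore item above concrete, here is a minimal, hypothetical sketch of one common pattern on GCP: exporting a BigQuery table to Cloud Storage so it can later be restored with a load job. Bucket, dataset, and table names are placeholders.

```python
# Hypothetical backup step: export a BigQuery table to Cloud Storage as Avro.
# Restore is the reverse: a load job (client.load_table_from_uri) from the same URIs.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

table_ref = "my-project.dw.fact_orders"
backup_uri = "gs://my-backup-bucket/dw/fact_orders/2023-11-22/*.avro"

extract_config = bigquery.ExtractJobConfig(destination_format="AVRO")

# Run the export job and block until it completes.
client.extract_table(table_ref, backup_uri, job_config=extract_config).result()
```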

4+ years of experience with data modeling, SQL, ETL, data warehousing, and data lakes

4+ years of experience writing production-grade data solutions (relational and NoSQL) in an enterprise-class RDBMS

2+ years of experience with enterprise-class Business Intelligence tools such as Looker, Power BI, and Tableau

Mastery in writing software in Python

Experience writing software in one or more languages, such as JavaScript, Java, R, or Go

Experience with systems monitoring/alerting, capacity planning, and performance tuning

Hands-on experience building frontend applications with React

Hands-on experience with CI/CD solutions (Cloud Build / Terraform)

Experience working with Google Cloud data products (Cloud SQL, Spanner, Cloud Storage, Pub/Sub, Dataflow, Dataproc, Bigtable, BigQuery, Dataprep, Composer, etc.)

Experience with IoT architectures and building real-time data streaming pipelines

Experience operationalizing machine learning models on large datasets (a brief batch-scoring sketch follows this requirements list)

Demonstrated leadership and self-direction, including a willingness to teach others and to learn new techniques

Demonstrated skills in selecting the right statistical tools given a data analysis problem

Ability to balance and prioritize multiple conflicting requirements with great attention to detail

Excellent verbal/written communication & data presentation skills, including the ability to succinctly summarize key findings and effectively communicate with both business and technical teams
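As a rough, hypothetical sketch of the "operationalizing machine learning models on large datasets" requirement above, the example below batch-scores a feature table with BigQuery ML and materializes the predictions for downstream reporting. The model, tables, and the assumed label column ("churned") are illustrative, not part of the posting.

```python
# Hypothetical batch-scoring step: score an entire feature table with a
# previously trained BigQuery ML model and materialize the predictions.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

score_sql = """
CREATE OR REPLACE TABLE `my-project.dw.churn_scores` AS
SELECT
  customer_id,
  predicted_churned,        -- BigQuery ML names outputs predicted_<label>
  predicted_churned_probs
FROM ML.PREDICT(
  MODEL `my-project.dw.churn_model`,
  (SELECT * FROM `my-project.dw.customer_features`)
)
"""

client.query(score_sql).result()
```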

Benefits

Unlimited PTO, paid parental leave, competitive and attractive compensation, performance-based bonuses, paid holidays, generous medical, dental, and vision plans, and life insurance.
