Brooksource

AWS Data Engineer


Job Location

Charlotte, NC, United States

Job Description

Location: Charlotte, NC (hybrid, 2-3 days/week in office)

3-year contract with opportunity for extension or full-time hire

W-2 only (cannot accommodate Corp-to-Corp or 1099)



Brooksource is searching for an AWS Data Engineer with expertise in data warehousing using AWS Redshift to join our Fortune 500 Energy & Utilities client in Charlotte, NC.
RESPONSIBILITIES:

  • Provides technical direction, guides the team on key technical aspects, and is responsible for product tech delivery
  • Lead the design, build, test, and deployment of components
  • Where applicable, work in collaboration with Lead Developers (Data Engineer, Software Engineer, Data Scientist, Technical Test Lead)
  • Understand requirements/use cases to outline technical scope and lead delivery of the technical solution
  • Confirm required developers and skill sets specific to the product
  • Provides leadership, direction, peer review, and accountability to developers on the product (key responsibility)
  • Works closely with the Product Owner to align on delivery goals and timing
  • Assists the Product Owner with prioritizing and managing the team backlog
  • Collaborates with Data and Solution Architects on key technical decisions
  • Responsible for the architecture and design to deliver the requirements and functionality
  • Skilled in developing data pipelines, focusing on long-term reliability and maintaining high data quality
  • Designs data warehousing solutions with the end user in mind, ensuring ease of use without compromising performance
  • Manage and resolve issues in production data warehouse environments on AWS

REQUIRED SKILLS:

  • 5+ years of AWS experience, specifically including AWS Redshift
  • AWS services - S3, EMR, Glue Jobs, Lambda, Athena, CloudTrail, SNS, SQS, CloudWatch, Step Functions, QuickSight
  • Experience with Kafka/messaging, preferably Confluent Kafka
  • Experience with EMR and databases such as Glue Catalog, Lake Formation, Redshift, DynamoDB, and Aurora
  • Experience with Amazon Redshift for AWS data warehousing
  • Proven track record in the design and implementation of data warehouse solutions using AWS
  • Skilled in data modeling and executing ETL processes tailored for data warehousing
  • Competence in developing and refining data pipelines within AWS
  • Proficient in handling both real-time and batch data processing tasks
  • Extensive understanding of database management fundamentals
  • Expertise in creating alerts and automated solutions for handling production problems
  • Tools and languages – Python, Spark, PySpark, and Pandas
  • Infrastructure as Code technology – Terraform/CloudFormation
  • Experience with a secrets management platform such as Vault or AWS Secrets Manager
  • Experience with event-driven architecture
  • DevOps pipeline (CI/CD); Bitbucket; Concourse
  • Experience with RDBMS platforms and strong proficiency with SQL
  • Experience with REST APIs and API Gateway
  • Deep knowledge of IAM roles and policies
  • Experience using AWS monitoring services such as CloudWatch, CloudTrail, and CloudWatch Events
  • Deep understanding of networking – DNS, TCP/IP, and VPN
  • Experience with an AWS workflow orchestration tool such as Airflow or Step Functions

PREFERRED SKILLS:

  • Experience with native AWS technologies for data and analytics such as Kinesis and OpenSearch
  • Databases - DocumentDB, MongoDB
  • Hadoop platform (Hive; HBase; Druid)
  • Java, Scala, Node.js
  • Workflow automation
  • Experience transitioning on-premise big data platforms to cloud-based platforms such as AWS
  • Strong background in Kubernetes, distributed systems, microservice architecture, and containers

ADDITIONAL REQUIREMENTS:

  • Ability to perform hands-on development and peer review for certain components/tech stack on the product
  • Standing up development instances and a migration path (with required security and access/roles)
  • Develop components and related processes (e.g., data pipelines and associated ETL processes, workflows)
  • Lead implementation of an integrated data quality framework
  • Ensures optimal framework design and load-testing scope to optimize performance (specifically for Big Data)
  • Supports data scientists with testing and validation of models
  • Performs impact analysis and identifies risks of design changes
  • Ability to build new data pipelines, identify existing data gaps, and provide automated solutions to deliver analytical capabilities and enriched data to applications
  • Ability to implement data pipelines with the right attentiveness to durability and data quality
  • Implements data warehousing products with the end user's experience in mind (ease of use with the right performance)
  • Ensures test-driven development
  • 5+ years of experience leading teams to deliver complex products
  • Strong technical and communication skills
  • Strong skills with business stakeholder interactions
  • Strong solutioning and architecture skills
  • 5+ years of experience building real-time data ingestion streams (event-driven)
  • Ensure data security and permissions solutions, including data encryption, user access controls, and logging
Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty, or status as a covered veteran, in accordance with applicable federal, state, and local laws.

Location: Charlotte, NC, US

Posted Date: 10/28/2024

Contact Information

Contact Human Resources
Brooksource

Posted

October 28, 2024
UID: 4906578991

AboutJobs.com does not guarantee the validity or accuracy of the job information posted in this database. It is the job seeker's responsibility to independently review all posting companies, contracts and job offers.