Cloud Data Engineer
Company: Access Data Consulting Corp
Posted on: February 15, 2019
Required skills:
1. Python - expert level (4/5); Python is used for data movement and transformation
2. Spark and the Spark ecosystem (transformation and manipulation)
3. Scala (for Spark)
4. Cloud data warehouse experience - Snowflake is a plus, but Redshift or Google BigQuery would also be good
5. Apache Airflow
6. AWS experience and AWS tools - Lambda, Kinesis, ECS, EC2, S3
7. CI/CD - continuous integration, continuous delivery
8. Data lakes
9. SQL experience (data migration, modeling, analysis)

Actual job description:
- Analyze data from multiple data sources and develop processes to integrate the data into a single, consistent view.
- Develop, orchestrate, and monitor complex ETL/ELT workflows using Python, AWS Step Functions, Lambda coordinator and runner functions, Glue, Airflow, and/or other tools.
- Implement robust data pipelines for batch and real-time analytics at scale, using a modern "Lambda"-style architecture, ideally with the AWS toolset.
- Develop custom Kinesis applications using the Kinesis API, KCL, and KPL, using Lambda polling to transform/aggregate in-stream data, and build tooling to monitor and dynamically re-shard as needed.
- Light API development using AWS API Gateway, Kinesis, and Lambda to process push-based events at scale (200K+ records per second).
- Implement serverless AWS architectures leveraging Lambda with state management via ElastiCache, Memcached, Redis, DynamoDB, etc., with careful consideration for monitoring, logging, and error handling.
- Infrastructure as code using CloudFormation or Terraform.
- Implementation of modern CI/CD pipelines for automated code deployment/testing.
- Ability to work heads-down and independently on projects to meet tight deadlines with accurate results.
- Strong communication and team-building skills.
- Ability to work with highly fluid requirements that may result in rework of development items as new data relationships are identified.
- Experience with .NET, C#, Python, PowerShell, stored procedures, or other development tools required.
- Expert-level Python for ETL.
- Expert with the AWS toolset (EC2, containers, Lambda, networking).
- Expertise with Snowflake a big plus; working knowledge of Redshift and Google BigQuery.

Provided by Dice. Tags: Cloud, AWS, Python, SQL Server, Scala, CI/CD
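To illustrate the "Lambda polling to transform/aggregate in-stream data" responsibility above, here is a minimal sketch of a Python Lambda handler. The event shape matches what Lambda receives from a Kinesis event source mapping (base64-encoded record data under `Records[].kinesis.data`); the per-record JSON schema (`metric`/`value`) and the aggregation logic are hypothetical, chosen only to show the decode-and-aggregate pattern.

```python
import base64
import json

def handler(event, context=None):
    """Aggregate in-stream records from a Kinesis-triggered Lambda invocation.

    `event` follows the Kinesis event source mapping shape:
        {"Records": [{"kinesis": {"data": "<base64-encoded JSON>"}}, ...]}
    Each decoded record body is assumed (for illustration only) to look like
        {"metric": "<name>", "value": <number>}
    and the handler returns per-metric running totals for the batch.
    """
    totals = {}
    for record in event.get("Records", []):
        # Kinesis record data arrives base64-encoded; decode then parse JSON.
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        key = payload["metric"]
        totals[key] = totals.get(key, 0) + payload["value"]
    return totals
```

In a real pipeline this handler would typically write the aggregates downstream (DynamoDB, ElastiCache, etc.) rather than return them; returning the dict keeps the sketch self-contained.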
Keywords: Access Data Consulting Corp, Denver, Cloud Data Engineer, Engineering, Englewood, Colorado