PySpark, ETL Scripts, Docker Experience, Git, Shell Scripting, Release Management, DevOps
 
Requirement id: 147964
Job title: Engineer
Job location: Madison, WI
Skills required: AWS Services, Python Scripting, PySpark, ETL Scripts, Docker Experience, Git, Shell Scripting
Open date: 14-Jun-2024
Close date:
Job type: Contract
Duration: 6 Months
Compensation: DOE
Status requirement: ---
Job interview type: ---
Email recruiter: coolsoft
Job description: Engineer with AWS Services, Python Scripting, PySpark, ETL Scripts, Docker Experience, Git, and Shell Scripting

Maximum Submissions per Supplier : 2

Start date : 7/8/2024

End date : 01/08/2025

Note: Submission deadline: 6/20/2024 by 8:00 A.M. CST

Client info: Madison, WI - IT - UW–Madison, Department of Information Technology (DoIT)

* Candidates must be current Wisconsin residents.

Interview Process:

• Remote, Microsoft Teams, Camera On

Duration:

• 6 Months from projected start date

Onsite or Remote?

• This is a 100% remote position (within Wisconsin).
• Candidates MUST be WI residents or willing to relocate to WI at the candidate's expense.

Additional Details:

• Remote position, possibility of 6-month extension.

AWS Engineer:

The Universities of Wisconsin Enterprise Analytics Platform (EAP) team develops and supports the data lake for the UW System Office and member universities.

Job Overview:

We are seeking an experienced AWS Redshift Data Engineer to assist our team in designing, developing, and optimizing data pipelines for our AWS Redshift-based data lakehouse.
Priority needs are CloudFormation and event-based data processing using SQS to support ingestion and movement of data from Workday to AWS Redshift for consumer and analytic use.
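The SQS-driven ingestion described above can be sketched as a Lambda-style handler in plain Python. The message shape (a JSON body naming the S3 bucket and key of a dropped Workday extract) and all names here are illustrative assumptions, not details from the posting:

```python
import json

def handler(event, context=None):
    """Illustrative sketch of an SQS-triggered ingestion handler.

    Assumes each SQS record body is a JSON message naming an S3 object
    dropped by a Workday extract; the real message shape will differ.
    """
    failures = []
    for record in event.get("Records", []):
        try:
            body = json.loads(record["body"])
            bucket, key = body["bucket"], body["key"]
            # In a real pipeline this step would kick off a COPY into a
            # Redshift staging table or enqueue a Glue/Airflow task; here
            # we only echo the target object.
            print(f"would load s3://{bucket}/{key} into Redshift staging")
        except (KeyError, json.JSONDecodeError):
            # Partial-batch failure reporting: SQS redelivers only the
            # records listed here, not the whole batch.
            failures.append({"itemIdentifier": record.get("messageId")})
    return {"batchItemFailures": failures}
```

Returning `batchItemFailures` (Lambda's partial-batch response for SQS event sources) lets well-formed messages succeed while malformed ones go back on the queue for retry.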

Key Responsibilities:

• Collaborate with data engineering, business analysts, and development teams to design, develop, test, and maintain robust and scalable data pipelines from Workday to AWS Redshift.
• Architect, implement, and manage end-to-end data pipelines, ensuring data accuracy, reliability, data quality, performance, and timeliness.
• Provide expertise in Redshift database optimization, performance tuning, and query optimization.
• Assist with design and implementation of workflows using Airflow.
• Perform data profiling and analysis to troubleshoot data-related issues and build solutions to address them.
• Proactively identify opportunities to automate tasks and develop reusable frameworks.
• Work closely with the version control team to maintain a well-organized and documented repository of code, scripts, and configurations using Git/Bitbucket.
• Provide technical guidance and mentorship to fellow developers, sharing insights into best practices, tips, and techniques for optimizing Redshift-based data solutions.

Required Qualifications and Skills:

• Advanced hands-on experience designing AWS data lake solutions.
• Experience integrating Redshift with other AWS services, such as DMS, Glue, Lambda, S3, Athena, Airflow.
• Proficiency in Python programming with a focus on developing efficient Airflow DAGs and operators.
• Experience with PySpark and Glue ETL scripting, including functions such as relationalize, performing joins, and transforming data frames with PySpark code.
• Competency developing CloudFormation templates to deploy AWS infrastructure, including YAML-defined IAM policies and roles.
• Experience with Airflow DAG creation.
• Familiarity with debugging serverless applications using AWS tooling such as CloudWatch Logs and Logs Insights, CloudTrail, and IAM.
• Ability to work in a highly complex Python object-oriented platform.
• Strong understanding of ETL best practices, data integration, data modeling, and data transformation.
• Proficiency in identifying and resolving performance bottlenecks and fine-tuning Redshift queries.
• Familiarity with version control systems, particularly Git, for maintaining a structured code repository.
• Strong coding and problem-solving skills, and attention to detail in data quality and accuracy.
• Ability to work collaboratively in a fast-paced, agile environment and effectively communicate technical concepts to non-technical stakeholders.
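As a concrete illustration of the CloudFormation bullet above, here is a minimal hypothetical template defining a YAML IAM role for an SQS-driven ingestion Lambda; all logical names and the bucket parameter are placeholders, not from the posting:

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Description: Illustrative IAM role for an SQS-driven ingestion Lambda (placeholder names).

Parameters:
  LandingBucketName:
    Type: String

Resources:
  IngestionLambdaRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Principal:
              Service: lambda.amazonaws.com
            Action: sts:AssumeRole
      Policies:
        - PolicyName: ReadLandingBucket
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Effect: Allow
                Action:
                  - s3:GetObject
                Resource: !Sub 'arn:aws:s3:::${LandingBucketName}/*'
      ManagedPolicyArns:
        - arn:aws:iam::aws:policy/service-role/AWSLambdaSQSQueueExecutionRole
```

The managed policy grants the SQS receive/delete and CloudWatch Logs permissions a queue-triggered Lambda needs; the inline policy scopes S3 read access to the landing bucket only.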

Call 502-379-4456 Ext 100 for more details. Please provide Requirement id: 147964 while calling.
 