Requirement id: 120717
Job title: Developer
Job location: Columbus, OH
Skills required: Data Warehouse, Data Analytics, Workflow, ETL, PySpark, Hadoop, ETL Administration, Hive, UNIX Administration, UAT
Open Date: 07-Dec-2021
Close Date:
Job type: Contract
Duration: 12 months
Compensation: DOE
Status requirement: ---
Job interview type: ---
Email Recruiter: coolsoft
Job Description: Developer - Data Warehouse, Data Analytics, Workflow, ETL, PySpark, Hadoop, ETL Administration, Hive, UNIX Administration, UAT

Note:

* Candidates must be local to central Ohio, or relocate by day one, to be considered.

Description:

The Big Data Warehouse Developer will be responsible for data warehouse design, development, implementation, migration, maintenance, and operation activities. The Big Data Warehouse Developer will be one of the key technical resources for various data warehouse projects, building critical data marts and handling data ingestion into the Big Data platform for data analytics and exchange.

Responsibilities:


Follow the organization's coding standards document; create mappings, sessions, and workflows as per the mapping specification document.

Perform gap and impact analysis of ETL and IOP jobs for new requirements and enhancements.

Create jobs in Hadoop using Sqoop, PySpark, and StreamSets to meet business user needs. Perform data analysis, data profiling, data quality checks, and data ingestion across the various layers using Hive/Impala queries on the Big Data/Hadoop platform, PySpark programs, and UNIX shell scripts.
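For illustration, a minimal PySpark sketch of the profiling-and-ingestion step described above; the path and table names (/data/landing/claims, stage.claims_raw) are hypothetical placeholders, not part of the client environment.

    # Minimal PySpark profiling/ingestion sketch; paths and table names are assumptions.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (SparkSession.builder
             .appName("landing_profile")
             .enableHiveSupport()
             .getOrCreate())

    # Read a landing-zone extract and compute simple data-quality metrics (null counts per column).
    df = spark.read.option("header", True).csv("/data/landing/claims")
    df.select([F.count(F.when(F.col(c).isNull(), c)).alias(c + "_nulls")
               for c in df.columns]).show(truncate=False)

    # Ingest into a Hive staging table for the downstream layers.
    df.write.mode("overwrite").saveAsTable("stage.claims_raw")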

Create mockup data, perform unit testing, and capture the result sets for jobs developed in the lower environment.
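A hedged example of what a unit test with mock data might look like (pytest-style, local Spark); the schema and de-duplication rule are invented for illustration.

    # Illustrative unit test: build mock data locally and assert the expected result set.
    from pyspark.sql import SparkSession

    def test_claim_dedup():
        spark = (SparkSession.builder
                 .master("local[1]")
                 .appName("unit_test")
                 .getOrCreate())
        mock = spark.createDataFrame([(1, "A"), (1, "A"), (2, "B")],
                                     ["claim_id", "status"])
        deduped = mock.dropDuplicates(["claim_id"])
        assert deduped.count() == 2  # capture the result set against the expected count
        spark.stop()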

Update the production support runbook and Control-M schedule document as per each production release.

Create and update design documents, and provide detailed descriptions of workflows after every production release.

Continuously monitor production data loads, fix issues, update the tracker document with those issues, and identify performance problems.

Performance-tune long-running ETL/ELT jobs by creating partitions, enabling full loads, and applying other standard approaches.
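As a sketch of the partitioning approach, assuming hypothetical staging and warehouse table names:

    # Rewrite a long-running load as a partitioned Parquet/Hive table so
    # downstream queries can prune by load_date. Table names are placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.enableHiveSupport().getOrCreate()

    (spark.table("stage.claims_raw")
          .repartition("load_date")
          .write
          .mode("overwrite")
          .partitionBy("load_date")
          .format("parquet")
          .saveAsTable("dw.claims_fact"))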

Perform quality assurance checks and reconciliation after data loads, and communicate with the vendor to receive corrected data.
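A simple post-load reconciliation check might look like the following; the table names are again placeholders.

    # Post-load reconciliation sketch: compare source and target row counts.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.enableHiveSupport().getOrCreate()

    src_count = spark.table("stage.claims_raw").count()
    tgt_count = spark.table("dw.claims_fact").count()

    if src_count != tgt_count:
        # In practice this would be logged to the tracker and reported to the vendor.
        print(f"Reconciliation mismatch: source={src_count}, target={tgt_count}")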

Participate in ETL/ELT code reviews and design re-usable frameworks.

Create Remedy/ServiceNow tickets to fix production issues, and create support requests to deploy database, Hadoop, Hive, Impala, UNIX, ETL/ELT, and SAS code to the UAT environment.

Create Remedy/ServiceNow tickets and/or incidents to trigger Control-M jobs for FTP and ETL/ELT jobs on an ad hoc, daily, weekly, monthly, and quarterly basis as needed.

Model and create STAGE/ODS/data warehouse Hive and Impala tables as needed.
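For example, a warehouse dimension table could be created through Spark SQL DDL; the database, table, and columns below are illustrative only.

    # Create a partitioned Hive warehouse table via Spark SQL; the schema is hypothetical.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.enableHiveSupport().getOrCreate()

    spark.sql("""
        CREATE TABLE IF NOT EXISTS dw.member_dim (
            member_id   BIGINT,
            member_name STRING,
            eff_date    DATE
        )
        PARTITIONED BY (load_date STRING)
        STORED AS PARQUET
    """)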

Create change requests, work plans, test results, and BCAB checklist documents for code deployment to the production environment, and perform code validation post-deployment.

Work with the Hadoop, ETL, and SAS admin teams on code deployments and health checks.

Requirements:

8+ years of experience with Big Data/Hadoop on data warehousing or data integration projects.

Required Education:

BS/BA degree, or a combination of education and experience.

Desired Skills:

Create re-usable UNIX shell scripts for file archival, file validations and Hadoop workflow looping.

Create a re-usable Audit Balance Control framework to capture reconciliation results, mapping parameters, and variables, serving as a single point of reference for workflows.

Create PySpark programs to ingest historical and incremental data.
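As a sketch of an incremental ingestion pattern (the audit table, paths, and column names are assumptions):

    # Incremental ingestion sketch: load only records newer than the last audited load.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.enableHiveSupport().getOrCreate()

    last_load = (spark.table("ctl.load_audit")
                      .agg(F.max("load_ts"))
                      .first()[0])

    incremental = (spark.read.parquet("/data/raw/claims")
                        .filter(F.col("update_ts") > F.lit(last_load)))

    incremental.write.mode("append").saveAsTable("stage.claims_raw")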

Create Sqoop scripts to ingest historical data from the EDW Oracle database into Hadoop (IOP), and create Hive table and Impala view creation scripts for dimension tables.
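A minimal sketch of driving such a Sqoop import, shown as a Python wrapper to keep the examples in one language (in practice this would typically be a shell script); the connection string, credentials, paths, and table name are hypothetical.

    # Hypothetical Sqoop import wrapper: pull an Oracle dimension table into HDFS as Parquet.
    import subprocess

    subprocess.run([
        "sqoop", "import",
        "--connect", "jdbc:oracle:thin:@//edw-host:1521/EDW",  # placeholder host/service
        "--username", "etl_user",
        "--password-file", "/user/etl/.oracle_pwd",
        "--table", "DIM_MEMBER",
        "--target-dir", "/data/edw/dim_member",
        "--as-parquetfile",
        "--num-mappers", "4",
    ], check=True)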

Participate in meetings to continuously upgrade functional and technical expertise.

Analysis, design, development, support, and enhancement of ETL/ELT in a data warehouse environment with Cloudera Big Data technologies (with a minimum of 7 years of experience in Hadoop, MapReduce, Sqoop, PySpark, Spark, HDFS, Hive, Impala, StreamSets, Kudu, Oozie, HUE, Kafka, YARN, Python, Flume, ZooKeeper, Sentry, Cloud…
 
Call 502-379-4456 Ext 100 for more details. Please provide Requirement id: 120717 while calling.
 
     