Performance Data Engineer

Our federal client is looking to hire a Performance Data Engineer
at their site in Raleigh, NC. Please send your resume in Word
format if you are interested in exploring this role. The position
is on a long-term contract basis and will pay an hourly rate of
$62.50/HR to $85/HR on a 1099 basis. We also offer the option of
$45/HR to $72.50/HR, or $115K to $140K, on a W2 basis, dependent
on level of experience, with full benefits and PTO. After a
2-month orientation period the position will allow for a hybrid
work week, with 2 days a week remote and 3 days onsite.

Job Title: Performance Data Engineer

Work Environment: Raleigh, NC (Hybrid work schedule available
after initial 2-month orientation period, 2 days a week remote)

Pay Rate: $62.50/HR to $85/HR (1099); $45/HR to $72.50/HR or
$115K to $140K (W2).

Term: Contract

3rd Party C2C/Transfer: No

Referral Fee: $500 – Refer a qualified colleague or friend.

***Due to the required clearance with our government client, only
US Citizens, Green Card holders, and Green Card EAD holders may be
considered. Clearance requires that candidates have resided in
the US for the past five years. The selected candidate cannot
have left the country for longer than 90 consecutive days or for
more than 180 cumulative days during that period.***

Job Description:

The candidate's knowledge, skills, and abilities with the
Teradata relational database must be far above those typically
found in application developers. Significant experience with the
Teradata optimizer is required, including the ability to read SQL
"explain" plans and take action to effectively exploit the
massively parallel nature of the database engine. A thorough
understanding of the internal "DBC Accounting" information is
expected in order to effectively measure the performance of
application code.
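As an illustrative sketch of the kind of work described above (the application tables here are hypothetical; DBC.AMPUsageV is a standard Teradata system view), reviewing an optimizer plan and pulling usage figures from the DBC accounting views might look like:

```sql
-- Ask the optimizer for its plan; in the output, watch for product
-- joins, full-table scans, and row redistribution steps across AMPs.
EXPLAIN
SELECT c.customer_id,
       SUM(o.order_total)                 -- hypothetical example tables
FROM   orders o
JOIN   customers c
  ON   o.customer_id = c.customer_id     -- joining on the primary index
GROUP  BY c.customer_id;                 -- avoids redistribution

-- Measure accumulated CPU and I/O per user from the DBC accounting
-- views, to identify which workloads consume the most resources.
SELECT UserName,
       SUM(CpuTime) AS TotalCpu,
       SUM(DiskIO)  AS TotalIO
FROM   DBC.AMPUsageV
GROUP  BY UserName
ORDER  BY TotalCpu DESC;
```

In practice, plan review and DBC accounting are used together: the accounting views identify the expensive workloads, and EXPLAIN shows why a given query is expensive.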

The following additional skills are desired, but are not
considered requirements:

* A working knowledge of how MicroStrategy and other OLAP tools
interact with Teradata.

* A working knowledge of the major competing Data Warehouse
modeling techniques (Third-Normal-Form and Dimensional modeling)
and how data modeling decisions affect the performance of ETL,
structured report queries, and unstructured data analysis.

* A working knowledge of Web API development.

* A working knowledge of storing, reading, and analyzing
streaming data – e.g., Kafka, MQSeries, or similar technologies,
such as JMS.

* A working knowledge of Cloud technology.

* Ab Initio ETL Coding in GDE
* Ab Initio Metadata Hub Lineage
* Ab Initio TRMC
* ANSI SQL and Teradata SQL extensions
* Teradata SQL Assistant (a.k.a. QueryMan) for EDS Support
* Teradata Utilities for EDS Support:
  * FastLoad
  * MultiLoad
  * FastExport
  * TPump

* Azure/Databricks Support:
  * Cloud Operations
  * Cloud Governance
  * Data Engineering
  * Platform Engineering
  * Data Integration
  * Data Observability

In addition, the following UNIX skills are required:

* UNIX commands and concepts in order to navigate source code
directories, find error logs, perform impact analysis
assessments, edit code, and version files
* Linux shell scripting in order to read and create driver
scripts

In addition, the following Ab Initio skills are required:

* A working knowledge of how Ab Initio conducts data extraction,
transformation, and loading – especially to and from Teradata
* Development of graphs, plans, and PSETs, and how to develop
tests and debug
* PDL scripting

Job Location: Raleigh, NC