Job Duties: Design and develop database and application solution strategy, technical design, architecture, and support for implementing enterprise database and data warehouse development. Build and maintain code to populate HDFS/Hadoop with log events from real-time data feeds or with data loaded from SQL production systems. Assess the impact of external production system changes on Spark-based Big Data systems and implement changes to the ETL to ensure consistent and accurate data flows. Build Hive scripts to ingest millions of records and perform performance tuning of these scripts. Use Sqoop to populate data from RDBMS into HDFS and develop Hive/Pig scripts to perform analysis. Create, develop, and support Business Intelligence initiatives using Informatica PowerCenter. Deliver advanced and complex reporting solutions, including dashboards and standardized reporting, using OBIEE. Perform integration testing and production deployment through GitHub. Write stored procedures, packages, functions, and views in PL/SQL. Will work in Charlotte, NC and/or at various client sites throughout the U.S. Must be willing to travel and/or relocate.
Employer: Pantar Solutions, Inc.
Location: 112 South Tryon Street, Suite 755, Charlotte, NC and/or various client sites throughout the U.S.
Apply to: Pantar Solutions, 112 South Tryon Street, Suite 755, Charlotte, NC 28284
This notice is provided as a result of the filing of an application for permanent alien labor certification for this job opportunity in compliance with 20 CFR 656.10(d). Any person may provide documentary evidence bearing on this application to:
Certifying Officer, U.S. Department of Labor
Employment and Training Administration
Atlanta Processing Center
233 Peachtree Street, Suite 410
Atlanta, GA 30303