ETL Developer/Data Engineer
Contractor will be working on the Modernizers team (EAM BI team). Their role is to develop, enhance, and support data pipelines using Informatica PowerCenter, AWS, and Snowflake.
• Designs and develops code and data pipelines to ingest data from relational databases (Oracle, SQL Server, DB2, Aurora), file shares, and web services.
• Designs and builds Informatica PowerCenter mappings and workflows.
• Builds streaming ingestion pipelines with Kinesis Data Streams, Kinesis Data Firehose, Kinesis Data Analytics, and Kafka (Amazon MSK).
• Builds data lakes on AWS S3, optimizing performance by partitioning and compressing data.
• Performs data engineering and analytics using AWS Glue, Informatica, EMR, Spark, Athena, and Python.
• Performs data modeling and builds data warehouses using Snowflake.
• Develops MarkLogic integrations with existing enterprise platforms.
• Participates in requirements definition, system architecture design, and data architecture design.
• Participates in all aspects of the software life cycle using Agile development methodologies.
• Bachelor’s degree in Computer Science, Computer Information Systems, Engineering, Statistics or closely related field (willing to accept foreign education equivalent) (required).
• Experience in AWS services for data and analytics (required).
• 5 years of experience in Data Ingestion, Data Extraction, and Data Integration (required).
• 5+ years of experience in Enterprise Information Solution architecture, design, and development (required).
• 5+ years of experience with integration architectures such as SOA, Microservices, ETL or other integration technologies.
• 5+ years of experience working with content or knowledge management systems, search engines, relational databases, NoSQL databases, ETL tools, geospatial systems, or semantic technologies.
• 3+ years of hands-on experience with the MarkLogic framework; DynamoDB experience preferred.
• 3+ years of hands-on experience with AWS services (S3, Kinesis, Lambda, Athena, Glue, EMR) required.
• 3+ years of experience with analytics tools such as SAS, R, Python, and other advanced statistical software.
• Experience with JSON or XML data modeling required.
• Experience with Git/GitHub, branching, and other modern source code management methodologies required.
• Domain knowledge of NoSQL or relational databases required.
• Understanding of database architecture and performance implications required.
• Experience with Machine Learning and Artificial Intelligence.
• Ability to multi-task effectively.
• Ability to work collaboratively as part of an Agile Team.
• Excellent written and verbal communication skills; sense of ownership, urgency, and drive.
We look forward to reviewing your application. We encourage everyone to apply, even if you don't check every box for what is listed or required.
PDSINC, LLC is an Equal Opportunity Employer.