
Data Engineer with Security Clearance


BigBear.ai

2024-11-06 01:44:04

Job location: Washington, District of Columbia, United States

Job type: Full-time

Job industry: I.T. & Communications

Job description

Overview

BigBear.ai is seeking a Data Engineer to join our team and help us build and maintain scalable, reliable data pipelines for our clients. You will be responsible for designing and developing data pipelines that support an end-to-end solution, integrating those pipelines with AWS cloud services to extract meaningful insights, and providing Tier 3 technical support for deployed applications and dataflows.

What you will do

- Develop and design data pipelines to support an end-to-end solution
- Develop and maintain artifacts (e.g., schemas, data dictionaries, and transforms) related to ETL processes
- Integrate data pipelines with AWS cloud services to extract meaningful insights
- Manage production data within multiple datasets, ensuring fault tolerance and redundancy
- Design and develop robust, functional dataflows to support raw data and expected data
- Provide Tier 3 technical support for deployed applications and dataflows
- Collaborate with the rest of the data engineering team to design and launch new features, including coordination and documentation of dataflows, capabilities, etc.

What you need to have

- An active Top Secret clearance with SCI eligibility and the ability to work in a fast-paced, collaborative environment
- A Bachelor's degree and 2+ years of experience, or 8 to 10 years of relevant experience with no degree
- Proficiency in data engineering, ETL architecture and development, and end-to-end processes
- Experience working with AWS cloud services such as S3, EC2, Lambda, Glue, and Athena
- Experience in database administration and API development
- Knowledge of data pipeline tools and frameworks such as Airflow, Spark, and Kafka
- Ability to design and develop robust, functional dataflows to support raw data and expected data
- Ability to develop and maintain artifacts such as schemas, data dictionaries, and transforms related to ETL processes
- Ability to collaborate with the rest of the data engineering team to design and launch new features, including coordination and documentation of dataflows, capabilities, etc.
- Ability to troubleshoot and resolve complex technical issues

What we'd like you to have

- Experience with cloud message APIs and push notifications
- A keen interest in learning and using the latest software tools, methods, and technologies to solve real-world problem sets vital to national security

About BigBear.ai

BigBear.ai delivers AI-powered analytics and cyber engineering solutions to support mission-critical operations and decision-making in complex, real-world environments. BigBear.ai's customers, which include the US Intelligence Community, the Department of Defense, and the US Federal Government, as well as customers in manufacturing, healthcare, commercial space, and other sectors, rely on BigBear.ai's solutions to see and shape their world through reliable, predictive insights and goal-oriented advice. Headquartered in Columbia, Maryland, BigBear.ai is a global, public company traded on the NYSE under the symbol BBAI. For more information, please visit BigBear.ai's website and follow the company on social media.
