
Data Engineering Specialist

Leidos
Full-time · Remote (US) · Scott AFB, IL, United States · Posted: 2026-05-11 · Open until: 2026-07-11
Job Description
The Leidos Digital Modernization Group seeks a Data Engineering Specialist to support the Global Management System (GMS) Team on the Global Solutions Management – Operations II (GSM-O II) contract. This contract covers the operations, sustainment, maintenance, repair, and defense of the Defense Information System Network (DISN) within the DoD Information Network (DODIN) in support of the Defense Information Systems Agency (DISA). It also includes support for other key DISA tasks, including the transformation of DISA’s operational mission through innovation and support to DISA’s mission partners.

The candidate must be within commuting distance of Scott AFB or Ft. Meade. At a minimum, a Secret clearance and a Security+ certification (or other applicable DoD 8570 IAT II certification) are required upon the start of employment.

The candidate will support data engineering activities, contributing to the integration and enrichment of DISN network topology data to enable advanced data correlation and analytics. They will assist in designing and implementing data enrichment pipelines, integrate multiple data sources into Confluent (Kafka) and Elastic platforms, and help maintain Kafka and Elastic clusters that support mission-critical operations. The candidate will contribute to platform sustainment and reliability by addressing operational challenges and supporting automation of the software development lifecycle, including CI/CD pipeline development, containerization, and automated testing, while following DevOps best practices. The role involves active participation in Agile scrum teams, collaborating with team members and sharing knowledge to support team growth. Additionally, the candidate will help develop and maintain technical documentation, ensuring solutions align with DoD security standards and compliance requirements.
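To illustrate the kind of enrichment-pipeline work described above, here is a minimal, hedged sketch of a topology-record enrichment step, the sort of transform that might run before records are produced to a Kafka topic and indexed into Elasticsearch. This is not GMS or Leidos code; every field name, site code, and lookup value is hypothetical.

```python
# Hypothetical reference data joining site codes to attributes.
# In a real pipeline this might come from a database or a compacted topic.
SITE_LOOKUP = {
    "SCOTT": {"region": "CONUS", "command": "USTRANSCOM"},
    "MEADE": {"region": "CONUS", "command": "DISA HQ"},
}

def enrich_record(record: dict, lookup: dict = SITE_LOOKUP) -> dict:
    """Join a raw topology record with reference data by site code."""
    enriched = dict(record)  # copy so the source record is never mutated
    site_info = lookup.get(record.get("site_code"), {})
    enriched.update(site_info)
    enriched["enriched"] = bool(site_info)  # flag records that matched
    return enriched

raw = {"node_id": "rtr-001", "site_code": "SCOTT", "status": "up"}
print(enrich_record(raw))
```

In a production pipeline, the enriched record would then be serialized (e.g., JSON or Avro) and handed to a Kafka producer; keeping the transform a pure function like this makes it straightforward to unit-test independently of the Kafka and Elastic clusters.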
As a GMS team member, you will work as part of a fast-paced Agile development and implementation team to architect, design, and develop an integrated solution that expands the foundational Integrated Data Architecture platform (Confluent and ELK). You will work alongside others in a matrixed organization across the project.

Primary Responsibilities:
- Contribute to data engineering efforts by supporting the integration and enrichment of DISN network topology data for advanced data correlation and analytics.
- Participate in technical discussions with internal and external stakeholders to support solution design and implementation.
- Develop, test, and deploy data pipelines and integration solutions across distributed systems and cloud environments using Python, JavaScript, Java, and SQL.
- Assist in requirements gathering and collaborate with stakeholders to design and implement data enrichment pipelines, integrating diverse data sources into Confluent (Kafka) and Elastic platforms.
- Develop and maintain Kibana visualizations and dashboards to support operational insights.
- Support Kafka system integrations between Elasticsearch/Logstash and other systems.
- Collaborate within Agile scrum teams, contribute to team deliverables, and share knowledge with peers.
- Communicate and coordinate effectively with geographically distributed team members to achieve project objectives.
- Troubleshoot and help resolve installation, infrastructure, and system issues; report and help mitigate technical risks.
- Develop and maintain technical documentation, including DoD requirements, interface documents, and security compliance artifacts.
- Ensure solutions comply with DoD security standards and guidelines, and support platform sustainment and reliability by addressing operational challenges as needed.

Basic Qualifications:
- Bachelor’s degree from an accredited college in a related discipline, or equivalent experience/combined education, with 4–8 years of professional experience; or 2–6 years of professional experience with a related master’s degree.
- 4+ years of experience in software engineering, data engineering, or business/data analysis, preferably within Agile/Scrum teams.
- Hands-on software development experience with Python, Java, and SQL, and working knowledge of JavaScript and HTML.
- Experience with distributed version control systems such as Git and Bitbucket.
- Proficiency with data analytics and visualization tools such as Kibana, Power BI, Tableau, and the ELK stack (Elasticsearch, Logstash, Kibana).
- Experience designing, developing, and optimizing ETL processes and data pipelines, including integration with event streaming platforms like Kafka.
- Background in data modeling, unification, and analytics to support data-driven projects.
- Experience implementing application and system integrations, including Kafka and Elastic platform integrations.
- Understanding of networking