ITility, LLC

Senior Cloud Data Engineer – AWS

Posted Date: 8/8/2025 10:30 AM
Job ID: 2025-3508
# of Openings: 1
Job Locations: US

Overview

Join our dynamic ITility team and put your skills and passion to work! We're looking for a Data Engineer to join our Data aNd Analytics (DNA) team. You’ll be instrumental in designing, developing, and maintaining modern data infrastructure to support business intelligence, data exchange, and a scalable data ecosystem hosted in AWS. This position supports a new development initiative for our client, the Chief Data Officer, and their transformation effort, building a semantic data layer and an enterprise data warehouse that brokers data across internal and external systems. This is a remote position with travel up to 10%, which may include occasional visits to client sites or government installations.

 

You’ll work on our prime contract with USMEPCOM, a key U.S. Department of Defense organization responsible for screening and processing applicants into the Armed Forces. USMEPCOM operates 65 Military Entrance Processing Stations (MEPS) across the country, serving as the critical link between recruitment and training.

 

At ITility, we help our customers command the future by thinking beyond perceived limits to create new, unexpected ways to protect and defend our nation. We inspire and empower people to create significant solutions that secure what matters to our customers and communities, here and around the globe.

We Value:

  • The Drive to Perform Beyond Perceived Limits.
  • The Desire to Find Significance in All We Do.
  • The Passion and Compassion That Powers Both.

Join us in building the data-driven future of federal IT. Apply your expertise to mission-driven work that protects and supports those who serve our nation.

Responsibilities

Key Responsibilities:

  • Design & Development:
    • Architect and implement scalable data solutions using Amazon RDS, S3, Redshift, Glue, Lambda, and Python.
    • Write, optimize, and maintain complex SQL queries for data retrieval and transformation.
    • Develop robust, scalable data pipelines for ingestion and processing across AWS services.
    • Support machine learning model deployment and inference pipelines using AWS SageMaker.
  • Analytics & Visualization Support:
    • Enable data visualization by integrating data sources with AWS QuickSight and Microsoft Power BI.
    • Collaborate with analysts and stakeholders to ensure datasets are properly structured, accessible, and aligned with business logic.
  • ETL/ELT Pipelines:
    • Design and manage ETL/ELT workflows to ingest and transform data from multiple sources.
    • Utilize AWS Glue, Lambda, and custom Python/SQL scripts for orchestration.
  • Performance Optimization:
    • Monitor and tune Amazon RDS performance.
    • Refine data models, indexing strategies, and query execution plans.
  • Collaboration & Communication:
    • Partner with analysts, data scientists, and stakeholders to define data requirements.
    • Maintain technical documentation and assist in knowledge sharing across teams.

Qualifications

Required Skills and Qualifications:

  • Bachelor’s degree in Computer Science, Engineering, or related discipline.
  • 8+ years of experience as a Data Engineer, Database Developer, or similar role.
  • AWS Certification (e.g., AWS Certified Data Engineer, Solutions Architect, or Data Analytics).
  • CompTIA Security+ certification.
  • Strong proficiency in SQL and Python (including libraries like pandas, numpy, and PySpark).
  • Hands-on experience with Amazon RDS setup, maintenance, and scaling.
  • Experience implementing and integrating AWS QuickSight for dashboards and analytics.
  • Proficiency working with Power BI, including report design and data modeling.
  • Experience with AWS services: Glue, Lambda, S3, Athena, Redshift, SNS, API Gateway.
  • Experience implementing machine learning workflows and model inference in AWS applications using tools like SageMaker.
  • Strong understanding of data modeling, normalization, and warehouse design.
  • Experience using CI/CD pipelines (Git, CodePipeline, etc.) for data engineering projects.
  • Familiarity with big data frameworks like Apache Spark or Hadoop.
  • Exposure to NoSQL databases (e.g., DynamoDB) and data streaming tools (e.g., Kafka, Kinesis).

Preferred Qualifications:

  • Master’s in Data Science from an accredited university.
  • Prior experience with DoD or Federal Government projects.
  • Background in DoD networks, ATO processes, or application security practices.

Soft Skills:

  • Excellent verbal and written communication skills.
  • Strong analytical and problem-solving mindset.
  • Ability to work collaboratively as part of a team and independently when needed.
  • Adaptable and eager to learn new tools and technologies.

Physical Requirements:

  • Ability to sit or stand for extended periods while performing computer-based tasks.
  • Regular use of hands for typing, writing, and handling office equipment; frequent talking, hearing, and seeing.
  • Occasional movement around the office, including climbing stairs.
  • Ability to travel up to 10%, which may include occasional visits to client sites or government installations.

ITility is an Equal Opportunity Employer

ITility is committed to providing a work environment that is non-discriminatory, harassment-free, fair, ethical, and inclusive.


ITility is committed to the principle of equal employment opportunity, and complies with all applicable laws which prohibit discrimination and harassment in the workplace. ITility strictly prohibits discrimination or harassment based on race, color, religion, national origin, sex, age, disability or any other characteristic protected by law in all terms, conditions and privileges of employment, including without limitation, recruiting, hiring, assignment, compensation, promotion, discipline and termination. This policy covers conduct occurring at ITility’s offices, client sites, other locations where ITility is providing services, and to all work-related activities.
