Lakeview Loan Servicing, LLC.
Job Description
Overview

The Data Engineer, Mortgage Servicing on the Nebula team acts as the mortgage servicing data subject matter expert and plays a critical role in building and evolving the data foundation that powers analytics, reporting, AI development, and operational decision-making across the organization. This role is responsible for designing, building, and maintaining reliable, scalable, and flexible data systems that support a wide range of internal and external use cases.

This role requires domain awareness in mortgage and servicing-related data environments, with an understanding of the complexities associated with loan-level lifecycle data, transaction processing, cash movement, and reconciliation across systems. The Data Engineer must be able to translate business workflows and system behavior into accurate, auditable data structures that support downstream reporting, operational processes, and regulatory requirements.

This role contributes to the development and evolution of core data capabilities, including batch and real-time pipelines, operational and analytical data stores, semantic models, and BI-ready datasets. Success requires strong technical depth across modern data tooling, sound systems thinking, and the ability to build reliable solutions in a cloud-based, regulated, high-stakes environment.

The Data Engineer is expected to operate effectively in a modern engineering environment, using automation, observability, and infrastructure-as-code practices to deploy, manage, and improve data pipelines and data platforms. In parallel, this individual will help enable downstream analytics, reporting, product capabilities, and AI systems by ensuring that data is trustworthy, accessible, and fit for purpose.

This is a fully remote position that offers a competitive salary range of $220,000 to $300,000, plus an annual bonus. You'll also receive our excellent benefits package, which includes medical coverage starting on day one and a company-matched 401(k).
Compensation may vary based on experience, location, and other job-related factors.

Responsibilities

Data Pipeline Development
- Design, build, and maintain robust data pipelines for a wide variety of input and output sources, including internal systems, third-party platforms, files, APIs, event streams, and databases
- Develop scalable ETL and ELT workflows for both batch and real-time processing
- Ensure pipelines are reliable, testable, observable, and easy to extend as business needs evolve
- Build reusable data integration patterns that support growing volumes, new source systems, and downstream consumers across analytics, applications, and AI initiatives

Data Platform & Storage
- Design and manage data architectures that support OLTP, OLAP, and reporting workloads across operational and analytical environments
- Build and optimize data models, warehouse schemas, and curated datasets for analytics and BI use cases
- Contribute to the design and operation of modern data platforms, including warehouses, lakehouses, streaming systems, and supporting orchestration frameworks
- Help define patterns for data storage, partitioning, performance optimization, retention, and lifecycle management

Servicing-Oriented Data Modeling & Integrity
- Design and maintain data models that accurately reflect loan-level lifecycle events, including payment activity, balances, adjustments, and status changes
- Ensure consistency and reconciliation across systems where transactional, financial, and reporting data must align
- Identify and resolve discrepancies across source systems, and build data structures that support accurate, auditable outputs for downstream operational processes, reporting, and decisioning

Cloud Deployment & Operations
- Deploy, operate, and improve data pipelines and data stores on major cloud platforms such as AWS, GCP, or Azure
- Use infrastructure-as-code, CI/CD, and automation practices to improve deployment speed, consistency, and reliability
- Monitor production data systems using logging, alerting, and observability tooling to proactively identify and resolve issues
- Support secure, resilient, and cost-conscious operation of cloud-based data infrastructure

Data Quality, Reliability & Governance
- Implement data quality checks, validation rules, reconciliation processes, and monitoring to ensure trustworthy data across systems
- Establish and maintain standards for lineage, documentation, metadata, schema evolution, and operational runbooks
- Partner with stakeholders to improve data accessibility, consistency, and usability while maintaining appropriate controls