
Lead Analytics Engineer

Eshipping
Full-time · Remote (US) · Frankfort, KY, Franklin, US · USD 6,667–9,583 / month · Posted: 2026-05-11 · Valid until: 2026-07-10
Job Description
Position Summary

The Lead Analytics Engineer serves as the primary analytics resource embedded within the Solutions team, bridging the gap between complex data systems and business decision-making. This role combines deep technical expertise in analytics engineering with a consultative partnership approach: translating business needs into well-structured data models, building scalable data pipelines, and equipping cross-functional stakeholders with the insights and tools they need to drive outcomes. The Lead Analytics Engineer also provides technical mentorship and guidance to peers, reviewing work for accuracy and helping elevate the team's overall data maturity.

Essential Duties and Responsibilities

Duties include, but are not limited to, the following:

- Design, build, and maintain scalable data models, reusable datasets, and analytics-ready assets that support reliable reporting, self-service analysis, and downstream decision-making across the organization
- Use SQL expertly to query, validate, and optimize data workflows, serving as a bridge between business questions, source systems, and scalable analytics solutions
- Write and maintain Python-based data transformation logic, including production-grade PySpark pipelines, to manipulate, validate, and operationalize complex datasets at scale
- Implement and manage bronze/silver/gold data modeling patterns within a Delta Lake or comparable lakehouse architecture
- Partner directly with the Solutions team as an embedded analytics resource, proactively identifying opportunities to leverage data for operational improvements
- Translate business requirements into technical specifications and deliver actionable insights to non-technical stakeholders
- Guide other team members on analytics engineering best practices, data modeling standards, and technical approaches
- Review the work of others to ensure data accuracy, consistency, and adherence to established standards
- Read and tune established reporting solutions to diagnose and resolve performance issues
- Collaborate with engineering, operations, finance, and customer success teams to understand evolving data needs
- Evaluate, learn, and adopt new tools, platforms, and frameworks quickly, helping the team stay effective in a fast-evolving data environment

Specific Department Responsibilities

- Serve as the point of contact between the data engineering function and the Solutions team, fostering an embedded partnership
- Proactively identify gaps in existing data models and reporting, and recommend improvements
- Contribute to the development and evolution of the organization's data strategy, including architecture decisions, tooling, and governance standards
- Support the evaluation and adoption of new data technologies and platforms
- Create and maintain technical documentation for data models, pipelines, and processes
- Participate in code reviews and provide constructive, growth-oriented feedback to peers
- Communicate project status, technical trade-offs, and data insights to both technical and non-technical audiences

Required Skills and Abilities

To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skills, and/or abilities required. Reasonable accommodation may be made to enable individuals with disabilities to perform the essential functions.
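To illustrate the bronze/silver/gold (medallion) pattern named in the duties above, here is a minimal sketch. In production these layers would be Delta Lake tables maintained by PySpark pipelines; Python's built-in sqlite3 stands in here purely so the example is self-contained, and the table and column names are illustrative, not taken from the employer's systems.

```python
import sqlite3

# Medallion-pattern sketch: bronze (raw) -> silver (clean) -> gold (aggregated).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Bronze: events landed as-is, duplicates and bad records included.
cur.execute("CREATE TABLE bronze_shipments (id INTEGER, lane TEXT, cost REAL)")
cur.executemany(
    "INSERT INTO bronze_shipments VALUES (?, ?, ?)",
    [(1, "KY-OH", 120.0),
     (1, "KY-OH", 120.0),   # duplicate load
     (2, "KY-TN", 95.5),
     (3, None,    80.0)],   # missing lane; fails validation below
)

# Silver: deduplicated, validated records ready for joins and analysis.
cur.execute("""
    CREATE TABLE silver_shipments AS
    SELECT DISTINCT id, lane, cost
    FROM bronze_shipments
    WHERE lane IS NOT NULL AND cost > 0
""")

# Gold: analytics-ready aggregate for self-service reporting.
cur.execute("""
    CREATE TABLE gold_lane_costs AS
    SELECT lane, COUNT(*) AS shipments, SUM(cost) AS total_cost
    FROM silver_shipments
    GROUP BY lane
""")

print(cur.execute("SELECT * FROM gold_lane_costs ORDER BY lane").fetchall())
# -> [('KY-OH', 1, 120.0), ('KY-TN', 1, 95.5)]
```

The key idea each layer encodes: bronze preserves the raw source of truth, silver applies validation and deduplication once rather than in every report, and gold exposes stable, reusable aggregates to downstream consumers.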
- Advanced proficiency in Python for data manipulation and analytics engineering, including writing clean, maintainable code to transform, validate, and operationalize complex datasets
- Expert-level proficiency in SQL, including window functions, CTEs, complex joins, MERGE operations, and query execution plan analysis, with the ability to use SQL as a bridge between business needs and scalable data solutions
- Familiarity with Delta Lake or comparable lakehouse technologies, including schema evolution, time travel, and medallion architecture patterns
- Demonstrated ability to quickly learn new tools, platforms, and frameworks and become productive with emerging technologies in a fast-evolving data environment
- Demonstrated ability to translate complex business requirements into well-structured, scalable data models
- Excellent written and verbal communication skills, with the ability to explain technical concepts to non-technical stakeholders
- Strong analytical and problem-solving skills with keen attention to detail
- Ability to work independently with minimal oversight while exercising sound judgment
- Comfort mentoring others and providing technical guidance without direct management authority
- Ability to manage multiple priorities and
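As a concrete illustration of the CTE and window-function SQL this role calls for, here is a small runnable sketch. It uses Python's built-in sqlite3 so the snippet is self-contained (MERGE and execution-plan analysis would be exercised on the warehouse engine itself), and the schema and data are invented for the example.

```python
import sqlite3

# Sample data: monthly shipment costs per carrier (illustrative values).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE shipments (carrier TEXT, month TEXT, cost REAL)")
conn.executemany(
    "INSERT INTO shipments VALUES (?, ?, ?)",
    [("A", "2026-01", 100.0), ("A", "2026-02", 140.0),
     ("B", "2026-01", 90.0),  ("B", "2026-02", 80.0)],
)

# The CTE computes monthly spend per carrier; the window function then
# ranks carriers within each month by spend.
query = """
WITH monthly AS (
    SELECT carrier, month, SUM(cost) AS spend
    FROM shipments
    GROUP BY carrier, month
)
SELECT month, carrier, spend,
       RANK() OVER (PARTITION BY month ORDER BY spend DESC) AS spend_rank
FROM monthly
ORDER BY month, spend_rank
"""
for row in conn.execute(query):
    print(row)
```

The CTE keeps the aggregation step named and readable, and the `PARTITION BY` clause restarts the ranking for each month, which is the kind of per-group analysis that would otherwise require correlated subqueries.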