Job Description

job summary:
Join a leading healthcare analytics company as a Software/Data Engineer. You will be responsible for designing and developing high-quality, maintainable software modules for the product suite. This critical role involves collaborating in the creation and maintenance of ETL scripts, tools, queries, and applications essential for healthcare data management, data validation, statistical report generation, and program validation. Your responsibilities will include analyzing requirements to create detailed designs for implementation, conducting unit and integration testing, and resolving software-related issues for internal and external customers. Additionally, you will ensure adherence to software engineering best practices and continuously update your professional knowledge of new technologies.

location: Telecommute
job type: Solutions
salary: $54 - 55 per hour
work hours: 9am to 5pm
education: Bachelors

responsibilities:
Job Duties
- Design and develop high-quality, maintainable software modules for the Cotiviti, Inc. product suite
- Conduct unit and integration testing using appropriate methodology and techniques
- Analyze requirements and specifications and create detailed designs for implementation
- Analyze and resolve software-related issues originating from internal or external customers
- Continuously update professional knowledge of new technologies as they are selected and integrated into the Cotiviti, Inc. product suite
- Review the software engineering approach to proposed solutions to ensure adherence to best practices
- Complete all special projects and other duties as assigned
- Work as a team member in the creation and maintenance of ETL scripts, tools, queries, and applications used for healthcare data management, data validation, statistical report generation, and program validation

Must be able to perform duties with or without reasonable accommodation.
qualifications:
Job Requirements
- Strong working knowledge of ETL, database technologies, big data, and data processing
- 5+ years of experience developing data extraction applications with Microsoft SQL Server and relational databases
- Experience resolving performance issues in high-data-volume SQL procedures, including how different types of indexes work
- 2+ years of experience developing applications using Hadoop, Spark, Impala, Hive, and Python
- 2+ years of experience running, using, and troubleshooting ETL in the Cloudera Hadoop ecosystem, e.g. Hadoop FS, Hive, Impala, Spark, Kafka, Hue, Oozie, YARN, and Sqoop
- Healthcare claim data knowledge is highly preferred
- Experience processing large amounts of structured and unstructured data with Spark
- Experience with data movement and transformation technologies
- Good understanding of the end-to-end (E2E) process of the application

Desired Skills & Experience
- Healthcare experience a plus
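To illustrate the kind of SQL performance work the requirements describe (diagnosing high-data-volume queries and understanding how indexes change execution), here is a minimal sketch. It uses SQLite as a portable stand-in for Microsoft SQL Server, and the `claims` table and `member_id` column are hypothetical examples, not part of the actual product suite:

```python
import sqlite3

# Hypothetical claims table, standing in for a high-volume healthcare dataset.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE claims (claim_id INTEGER PRIMARY KEY, member_id INTEGER, amount REAL)"
)
cur.executemany(
    "INSERT INTO claims (member_id, amount) VALUES (?, ?)",
    [(i % 1000, i * 0.5) for i in range(10000)],
)

query = "SELECT SUM(amount) FROM claims WHERE member_id = 42"

# Without a secondary index on member_id, the engine must scan the whole table.
plan_before = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# Adding an index lets the engine seek directly to the matching rows.
cur.execute("CREATE INDEX idx_claims_member ON claims (member_id)")
plan_after = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(plan_before[0][-1])  # a full-table SCAN
print(plan_after[0][-1])   # a SEARCH using idx_claims_member
```

The same diagnostic loop applies on SQL Server (via estimated/actual execution plans): confirm the scan, add or adjust the index, and verify the plan switched to a seek before trusting the fix.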